2024-05-30 21:53:13,397 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+c8d02d2
------------------------------------------------------------
2024-05-30 21:53:14,899 - mmdet - INFO - Distributed training: True
2024-05-30 21:53:16,503 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPThreeBranch',
        n_points=4,
        deform_num_heads=16,
        cffn_ratio=0.25,
        deform_ratio=0.5,
        with_cffn=True,
        interact_attn_type='deform',
        interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=672,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=24,
            embed_dim=1024,
            num_heads=16,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.4,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 1], [2, 3], [4, 5], [6, 7], [8, 9],
                                 [10, 11], [12, 13], [14, 15], [16, 17],
                                 [18, 19], [20, 21], [22, 23]],
            pretrained='./pretrained/deit_3_large_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True
            ],
            window_size=[
                28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28,
                28, 28, 28, 28, 28, 28, 28, 28, 28
            ],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch2=dict(
            real_size=896,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=768,
            num_heads=12,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.15,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch3=dict(
            real_size=1568,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=384,
            num_heads=6,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True))),
    neck=dict(
        type='FPN',
        in_channels=[1024, 1024, 1024, 1024],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='DeltaXYWHBBoxCoder',
            target_means=[0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(
            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=80,
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0.0, 0.0, 0.0, 0.0],
                target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(
                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead',
            num_convs=4,
            in_channels=256,
            conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(
                type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=-1,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            mask_size=28,
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=1000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            score_thr=0.05,
            nms=dict(type='nms', iou_threshold=0.5),
            max_per_img=100,
            mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(
    mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(
        type='Normalize',
        mean=[127.5, 127.5, 127.5],
        std=[127.5, 127.5, 127.5],
        to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1568, 941),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(
                type='Collect',
                keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(
        num_layers=24, layer_decay_rate=0.85, skip_stride=[2, 2]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.001,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(
    imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)
2024-05-30 21:53:20,146 - mmdet - INFO - Set random seed to 131810781, deterministic: False
2024-05-30 21:53:29,060 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 21:53:30,842 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 21:53:33,246 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 21:54:49,884 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-30 21:54:50,392 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-30 21:54:50,457 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
backbone.w1 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w2 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w3 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.pos_embed - torch.Size([1, 196, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.weight - torch.Size([1024, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.proj.bias -
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.bias - 
torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.6.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.7.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.9.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.weight - torch.Size([1024]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.weight - torch.Size([1024, 1024]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.weight - torch.Size([3072, 
1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.17.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.bias - torch.Size([1024]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.bias - torch.Size([4096]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.pos_embed - torch.Size([1, 196, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.weight - torch.Size([768, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.0.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.2.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.4.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.5.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.7.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.11.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling 
`init_weights` of MaskRCNN
The same message, "The value is the same before and after calling `init_weights` of MaskRCNN", is repeated for every interaction-unit parameter in this section of the log. Each backbone.interactions.{i} logged here contains an interaction_units_12 (branch dims 1024 and 768) and an interaction_units_23 (branch dims 768 and 384), and every unit has the same parameter layout; only the channel width d of each injector changes:
    branch2to1_proj: weight [d_big, d_small], bias [d_big], followed by a branch2to1_injector operating at d = d_big
    branch1to2_proj: weight [d_small, d_big], bias [d_small], followed by a branch1to2_injector operating at d = d_small
    (units_12: d_big = 1024, d_small = 768; units_23: d_big = 768, d_small = 384)
Every injector (d = 1024, 768 or 384) carries:
    ca_gamma, cffn_gamma, query_norm.{weight,bias}, feat_norm.{weight,bias}, ffn_norm.{weight,bias}: [d]
    attn.sampling_offsets: weight [128, d], bias [128]
    attn.attention_weights: weight [64, d], bias [64]
    attn.value_proj: weight [d/2, d], bias [d/2]
    attn.output_proj: weight [d, d/2], bias [d]
    ffn.fc1: weight [d/4, d], bias [d/4]
    ffn.dwconv.dwconv: weight [d/4, 1, 3, 3], bias [d/4]
    ffn.fc2: weight [d, d/4], bias [d]
All of these parameters are reported as unchanged by `init_weights`.
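A dump in this form is easier to audit when the parameter names are grouped by module prefix. The following is a minimal sketch (not produced by the training run) of how such a grouped shape summary could be generated from any torch.nn.Module; the prefix depth of 5 is an arbitrary choice matching names like backbone.interactions.0.interaction_units_12.branch2to1_injector, and `model` is assumed to be the constructed detector.

from collections import defaultdict

import torch.nn as nn


def summarize_parameter_shapes(module: nn.Module, depth: int = 5) -> None:
    # Group parameters by the first `depth` components of their dotted name,
    # e.g. "backbone.interactions.0.interaction_units_12.branch2to1_injector".
    groups = defaultdict(list)
    for name, param in module.named_parameters():
        parts = name.split(".")
        prefix = ".".join(parts[:depth])
        suffix = ".".join(parts[depth:]) or prefix
        groups[prefix].append((suffix, list(param.shape)))
    for prefix, entries in groups.items():
        print(prefix)
        for suffix, shape in entries:
            print(f"    {suffix} - torch.Size({shape})")

# summarize_parameter_shapes(model)  # hypothetical `model`: the detector built from this config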
These per-parameter messages continue in exactly this pattern, with the shapes listed above, through the rest of backbone.interactions.2, all of backbone.interactions.3, and into backbone.interactions.4.
backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight 
- torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight - 
torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
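
The ffn entries follow a quarter-width inverted-bottleneck pattern: fc1 maps the embedding down to a quarter of its width, a 3x3 depthwise convolution keeps that width (hence the [hidden, 1, 3, 3] dwconv weight), and fc2 maps back up. A minimal sketch with illustrative class and argument names, not the project's own definitions:

import torch
import torch.nn as nn

class ConvFFNSketch(nn.Module):
    def __init__(self, dim, hidden_ratio=0.25):
        super().__init__()
        hidden = int(dim * hidden_ratio)           # 1024 -> 256, 768 -> 192, 384 -> 96
        self.fc1 = nn.Linear(dim, hidden)          # weight [hidden, dim]
        # groups=hidden makes the 3x3 conv depthwise, so its weight is [hidden, 1, 3, 3]
        self.dwconv = nn.Conv2d(hidden, hidden, kernel_size=3, padding=1, groups=hidden)
        self.fc2 = nn.Linear(hidden, dim)          # weight [dim, hidden]

for name, p in ConvFFNSketch(768).named_parameters():
    print(name, tuple(p.shape))
# fc1.weight (192, 768), dwconv.weight (192, 1, 3, 3), fc2.weight (768, 192), ...
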
backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - 
torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
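
The *_proj and *_gamma entries pair a width-matching linear projection between branches with per-channel scale vectors. The sketch below shows one plausible wiring, assuming ca_gamma and cffn_gamma act as layer-scale weights on the injected attention and FFN outputs; the tensor names follow the log, but the wiring itself is an assumption.

import torch
import torch.nn as nn

dim_small, dim_large = 768, 1024
branch2to1_proj = nn.Linear(dim_small, dim_large)   # weight [1024, 768], bias [1024]
ca_gamma = nn.Parameter(torch.zeros(dim_large))     # [1024]
cffn_gamma = nn.Parameter(torch.zeros(dim_large))   # [1024]

x_large = torch.randn(2, 196, dim_large)            # tokens of the 1024-dim branch
x_small = torch.randn(2, 196, dim_small)            # tokens of the 768-dim branch

feat = branch2to1_proj(x_small)                     # width-matched features fed to the injector
attn_out = torch.zeros_like(x_large)                # stand-in for the deformable-attention output
ffn_out = torch.zeros_like(x_large)                 # stand-in for the ConvFFN output
x_large = x_large + ca_gamma * attn_out             # per-channel (layer-scale) residual updates
x_large = x_large + cffn_gamma * ffn_out
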
backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.bias 
- torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.0.weight - torch.Size([1024, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.merge_branch3.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.0.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 
neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead
2024-05-30 21:55:07,722 - mmdet - INFO - {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2]}
2024-05-30 21:55:07,722 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 26
2024-05-30 21:55:07,736 - mmdet - INFO - Param groups = { "layer_25_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight",
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", "neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", "backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.05 }, "layer_0_no_decay": { 
"param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", "backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", "backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias" ], "lr_scale": 
0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", "backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", 
"backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", "backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", "backbone.branch3.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", 
"backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", "backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", "backbone.branch3.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", 
"backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.branch1.blocks.12.gamma_1", "backbone.branch1.blocks.12.gamma_2", "backbone.branch1.blocks.12.norm1.weight", "backbone.branch1.blocks.12.norm1.bias", "backbone.branch1.blocks.12.attn.qkv.bias", "backbone.branch1.blocks.12.attn.proj.bias", 
"backbone.branch1.blocks.12.norm2.weight", "backbone.branch1.blocks.12.norm2.bias", "backbone.branch1.blocks.12.mlp.fc1.bias", "backbone.branch1.blocks.12.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_13_decay": { "param_names": [ "backbone.branch1.blocks.12.attn.qkv.weight", "backbone.branch1.blocks.12.attn.proj.weight", "backbone.branch1.blocks.12.mlp.fc1.weight", "backbone.branch1.blocks.12.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_14_no_decay": { "param_names": [ "backbone.branch1.blocks.13.gamma_1", "backbone.branch1.blocks.13.gamma_2", "backbone.branch1.blocks.13.norm1.weight", "backbone.branch1.blocks.13.norm1.bias", "backbone.branch1.blocks.13.attn.qkv.bias", "backbone.branch1.blocks.13.attn.proj.bias", "backbone.branch1.blocks.13.norm2.weight", "backbone.branch1.blocks.13.norm2.bias", "backbone.branch1.blocks.13.mlp.fc1.bias", "backbone.branch1.blocks.13.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", "backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_14_decay": { "param_names": [ "backbone.branch1.blocks.13.attn.qkv.weight", "backbone.branch1.blocks.13.attn.proj.weight", "backbone.branch1.blocks.13.mlp.fc1.weight", "backbone.branch1.blocks.13.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_15_no_decay": { "param_names": [ "backbone.branch1.blocks.14.gamma_1", "backbone.branch1.blocks.14.gamma_2", "backbone.branch1.blocks.14.norm1.weight", "backbone.branch1.blocks.14.norm1.bias", "backbone.branch1.blocks.14.attn.qkv.bias", "backbone.branch1.blocks.14.attn.proj.bias", "backbone.branch1.blocks.14.norm2.weight", "backbone.branch1.blocks.14.norm2.bias", "backbone.branch1.blocks.14.mlp.fc1.bias", "backbone.branch1.blocks.14.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_15_decay": { "param_names": [ "backbone.branch1.blocks.14.attn.qkv.weight", "backbone.branch1.blocks.14.attn.proj.weight", "backbone.branch1.blocks.14.mlp.fc1.weight", "backbone.branch1.blocks.14.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_16_no_decay": { "param_names": [ "backbone.branch1.blocks.15.gamma_1", 
"backbone.branch1.blocks.15.gamma_2", "backbone.branch1.blocks.15.norm1.weight", "backbone.branch1.blocks.15.norm1.bias", "backbone.branch1.blocks.15.attn.qkv.bias", "backbone.branch1.blocks.15.attn.proj.bias", "backbone.branch1.blocks.15.norm2.weight", "backbone.branch1.blocks.15.norm2.bias", "backbone.branch1.blocks.15.mlp.fc1.bias", "backbone.branch1.blocks.15.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_16_decay": { "param_names": [ "backbone.branch1.blocks.15.attn.qkv.weight", "backbone.branch1.blocks.15.attn.proj.weight", "backbone.branch1.blocks.15.mlp.fc1.weight", "backbone.branch1.blocks.15.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_17_no_decay": { "param_names": [ "backbone.branch1.blocks.16.gamma_1", "backbone.branch1.blocks.16.gamma_2", "backbone.branch1.blocks.16.norm1.weight", "backbone.branch1.blocks.16.norm1.bias", "backbone.branch1.blocks.16.attn.qkv.bias", "backbone.branch1.blocks.16.attn.proj.bias", "backbone.branch1.blocks.16.norm2.weight", "backbone.branch1.blocks.16.norm2.bias", "backbone.branch1.blocks.16.mlp.fc1.bias", "backbone.branch1.blocks.16.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_17_decay": { "param_names": [ "backbone.branch1.blocks.16.attn.qkv.weight", "backbone.branch1.blocks.16.attn.proj.weight", "backbone.branch1.blocks.16.mlp.fc1.weight", "backbone.branch1.blocks.16.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_18_no_decay": { "param_names": [ "backbone.branch1.blocks.17.gamma_1", "backbone.branch1.blocks.17.gamma_2", "backbone.branch1.blocks.17.norm1.weight", "backbone.branch1.blocks.17.norm1.bias", "backbone.branch1.blocks.17.attn.qkv.bias", "backbone.branch1.blocks.17.attn.proj.bias", "backbone.branch1.blocks.17.norm2.weight", "backbone.branch1.blocks.17.norm2.bias", "backbone.branch1.blocks.17.mlp.fc1.bias", "backbone.branch1.blocks.17.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", 
"backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_18_decay": { "param_names": [ "backbone.branch1.blocks.17.attn.qkv.weight", "backbone.branch1.blocks.17.attn.proj.weight", "backbone.branch1.blocks.17.mlp.fc1.weight", "backbone.branch1.blocks.17.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_19_no_decay": { "param_names": [ "backbone.branch1.blocks.18.gamma_1", "backbone.branch1.blocks.18.gamma_2", "backbone.branch1.blocks.18.norm1.weight", "backbone.branch1.blocks.18.norm1.bias", "backbone.branch1.blocks.18.attn.qkv.bias", "backbone.branch1.blocks.18.attn.proj.bias", "backbone.branch1.blocks.18.norm2.weight", "backbone.branch1.blocks.18.norm2.bias", "backbone.branch1.blocks.18.mlp.fc1.bias", "backbone.branch1.blocks.18.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_19_decay": { "param_names": [ "backbone.branch1.blocks.18.attn.qkv.weight", "backbone.branch1.blocks.18.attn.proj.weight", "backbone.branch1.blocks.18.mlp.fc1.weight", "backbone.branch1.blocks.18.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_20_no_decay": { "param_names": [ "backbone.branch1.blocks.19.gamma_1", "backbone.branch1.blocks.19.gamma_2", "backbone.branch1.blocks.19.norm1.weight", "backbone.branch1.blocks.19.norm1.bias", "backbone.branch1.blocks.19.attn.qkv.bias", "backbone.branch1.blocks.19.attn.proj.bias", "backbone.branch1.blocks.19.norm2.weight", "backbone.branch1.blocks.19.norm2.bias", "backbone.branch1.blocks.19.mlp.fc1.bias", "backbone.branch1.blocks.19.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_20_decay": { "param_names": [ 
"backbone.branch1.blocks.19.attn.qkv.weight", "backbone.branch1.blocks.19.attn.proj.weight", "backbone.branch1.blocks.19.mlp.fc1.weight", "backbone.branch1.blocks.19.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", "backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_21_no_decay": { "param_names": [ "backbone.branch1.blocks.20.gamma_1", "backbone.branch1.blocks.20.gamma_2", "backbone.branch1.blocks.20.norm1.weight", "backbone.branch1.blocks.20.norm1.bias", "backbone.branch1.blocks.20.attn.qkv.bias", "backbone.branch1.blocks.20.attn.proj.bias", "backbone.branch1.blocks.20.norm2.weight", "backbone.branch1.blocks.20.norm2.bias", "backbone.branch1.blocks.20.mlp.fc1.bias", "backbone.branch1.blocks.20.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_21_decay": { "param_names": [ "backbone.branch1.blocks.20.attn.qkv.weight", "backbone.branch1.blocks.20.attn.proj.weight", "backbone.branch1.blocks.20.mlp.fc1.weight", "backbone.branch1.blocks.20.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_22_no_decay": { "param_names": [ "backbone.branch1.blocks.21.gamma_1", "backbone.branch1.blocks.21.gamma_2", "backbone.branch1.blocks.21.norm1.weight", "backbone.branch1.blocks.21.norm1.bias", "backbone.branch1.blocks.21.attn.qkv.bias", "backbone.branch1.blocks.21.attn.proj.bias", "backbone.branch1.blocks.21.norm2.weight", "backbone.branch1.blocks.21.norm2.bias", "backbone.branch1.blocks.21.mlp.fc1.bias", "backbone.branch1.blocks.21.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_22_decay": { "param_names": [ "backbone.branch1.blocks.21.attn.qkv.weight", "backbone.branch1.blocks.21.attn.proj.weight", "backbone.branch1.blocks.21.mlp.fc1.weight", "backbone.branch1.blocks.21.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_23_no_decay": { 
"param_names": [ "backbone.branch1.blocks.22.gamma_1", "backbone.branch1.blocks.22.gamma_2", "backbone.branch1.blocks.22.norm1.weight", "backbone.branch1.blocks.22.norm1.bias", "backbone.branch1.blocks.22.attn.qkv.bias", "backbone.branch1.blocks.22.attn.proj.bias", "backbone.branch1.blocks.22.norm2.weight", "backbone.branch1.blocks.22.norm2.bias", "backbone.branch1.blocks.22.mlp.fc1.bias", "backbone.branch1.blocks.22.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_23_decay": { "param_names": [ "backbone.branch1.blocks.22.attn.qkv.weight", "backbone.branch1.blocks.22.attn.proj.weight", "backbone.branch1.blocks.22.mlp.fc1.weight", "backbone.branch1.blocks.22.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_24_no_decay": { "param_names": [ "backbone.branch1.blocks.23.gamma_1", "backbone.branch1.blocks.23.gamma_2", "backbone.branch1.blocks.23.norm1.weight", "backbone.branch1.blocks.23.norm1.bias", "backbone.branch1.blocks.23.attn.qkv.bias", "backbone.branch1.blocks.23.attn.proj.bias", "backbone.branch1.blocks.23.norm2.weight", "backbone.branch1.blocks.23.norm2.bias", "backbone.branch1.blocks.23.mlp.fc1.bias", "backbone.branch1.blocks.23.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", "backbone.branch3.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_24_decay": { "param_names": [ "backbone.branch1.blocks.23.attn.qkv.weight", "backbone.branch1.blocks.23.attn.proj.weight", "backbone.branch1.blocks.23.mlp.fc1.weight", "backbone.branch1.blocks.23.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_25_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", 
"backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", 
"backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", 
"backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", 
"backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", 
"backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", 
"backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", 
"backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", 
"backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", 
"backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", "roi_head.bbox_head.shared_fcs.0.bias", "roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-30 21:55:38,213 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 
2024-05-30 21:55:38,629 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16
2024-05-30 21:55:38,629 - mmdet - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH ) StepLrUpdaterHook
(49 ) ToBFloat16HookMMDet
(NORMAL ) DeepspeedCheckpointHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_train_epoch:
(VERY_HIGH ) StepLrUpdaterHook
(NORMAL ) DistSamplerSeedHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_train_iter:
(VERY_HIGH ) StepLrUpdaterHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
 --------------------
after_train_iter:
(ABOVE_NORMAL) OptimizerHook
(NORMAL ) DeepspeedCheckpointHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
after_train_epoch:
(NORMAL ) DeepspeedCheckpointHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_val_epoch:
(NORMAL ) DistSamplerSeedHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_val_iter:
(LOW ) IterTimerHook
 --------------------
after_val_iter:
(LOW ) IterTimerHook
 --------------------
after_val_epoch:
(VERY_LOW ) TextLoggerHook
 --------------------
after_run:
(VERY_LOW ) TextLoggerHook
 --------------------
2024-05-30 21:55:38,629 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs
2024-05-30 21:55:38,640 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16 by HardDiskBackend.
2024-05-30 21:56:25,292 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 22:46:46, time: 0.933, data_time: 0.127, memory: 18532, loss_rpn_cls: 0.6847, loss_rpn_bbox: 0.1290, loss_cls: 1.6980, acc: 68.1736, loss_bbox: 0.0280, loss_mask: 1.2503, loss: 3.7900
2024-05-30 21:57:04,781 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 21:01:15, time: 0.790, data_time: 0.058, memory: 18572, loss_rpn_cls: 0.3267, loss_rpn_bbox: 0.1127, loss_cls: 0.3465, acc: 95.7708, loss_bbox: 0.1245, loss_mask: 0.7615, loss: 1.6719
2024-05-30 21:57:43,497 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 20:18:05, time: 0.774, data_time: 0.046, memory: 18572, loss_rpn_cls: 0.2481, loss_rpn_bbox: 0.1037, loss_cls: 0.3191, acc: 95.4709, loss_bbox: 0.1431, loss_mask: 0.6995, loss: 1.5135
2024-05-30 21:58:23,164 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 20:03:05, time: 0.793, data_time: 0.044, memory: 18572, loss_rpn_cls: 0.2309, loss_rpn_bbox: 0.1084, loss_cls: 0.3064, acc: 95.3860, loss_bbox: 0.1471, loss_mask: 0.6842, loss: 1.4770
2024-05-30 21:59:02,255 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 19:50:31, time: 0.782, data_time: 0.047, memory: 18572, loss_rpn_cls: 0.2085, loss_rpn_bbox: 0.1042, loss_cls: 0.3667, acc: 94.4424, loss_bbox: 0.1828, loss_mask: 0.6705, loss: 1.5327
2024-05-30 21:59:40,788 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 19:39:12, time: 0.771, data_time: 0.047, memory: 18619, loss_rpn_cls: 0.1516, loss_rpn_bbox: 0.0959, loss_cls: 0.4159, acc: 93.6265, loss_bbox: 0.2259, loss_mask: 0.6527, loss: 1.5421
2024-05-30 22:00:19,966 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 19:33:36, time: 0.784, data_time: 0.052, memory: 18702, loss_rpn_cls: 0.1383, loss_rpn_bbox: 0.1015, loss_cls: 0.4354, acc: 92.6968, loss_bbox: 0.2607, loss_mask: 0.6196, loss: 1.5555
2024-05-30 22:00:59,342 - mmdet - INFO - Epoch
[1][400/7330] lr: 7.982e-05, eta: 19:29:58, time: 0.787, data_time: 0.054, memory: 18733, loss_rpn_cls: 0.1229, loss_rpn_bbox: 0.1004, loss_cls: 0.4604, acc: 91.9336, loss_bbox: 0.2910, loss_mask: 0.5960, loss: 1.5707 2024-05-30 22:01:38,955 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 19:27:46, time: 0.792, data_time: 0.051, memory: 18841, loss_rpn_cls: 0.1160, loss_rpn_bbox: 0.1012, loss_cls: 0.4634, acc: 91.4348, loss_bbox: 0.3046, loss_mask: 0.5675, loss: 1.5527 2024-05-30 22:02:18,646 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 19:26:06, time: 0.794, data_time: 0.045, memory: 18856, loss_rpn_cls: 0.1039, loss_rpn_bbox: 0.0984, loss_cls: 0.4263, acc: 91.7036, loss_bbox: 0.2928, loss_mask: 0.5388, loss: 1.4602 2024-05-30 22:02:57,850 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 19:23:19, time: 0.784, data_time: 0.041, memory: 18856, loss_rpn_cls: 0.0948, loss_rpn_bbox: 0.0913, loss_cls: 0.4413, acc: 91.3042, loss_bbox: 0.3109, loss_mask: 0.5185, loss: 1.4568 2024-05-30 22:03:38,060 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 19:23:17, time: 0.804, data_time: 0.057, memory: 18856, loss_rpn_cls: 0.0936, loss_rpn_bbox: 0.0976, loss_cls: 0.4435, acc: 90.3984, loss_bbox: 0.3399, loss_mask: 0.5034, loss: 1.4780 2024-05-30 22:04:17,762 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 19:22:07, time: 0.795, data_time: 0.053, memory: 18856, loss_rpn_cls: 0.0882, loss_rpn_bbox: 0.0926, loss_cls: 0.4250, acc: 90.5796, loss_bbox: 0.3325, loss_mask: 0.4932, loss: 1.4316 2024-05-30 22:05:11,049 - mmdet - INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 19:49:12, time: 1.066, data_time: 0.056, memory: 18856, loss_rpn_cls: 0.0844, loss_rpn_bbox: 0.0906, loss_cls: 0.4022, acc: 90.7324, loss_bbox: 0.3268, loss_mask: 0.4737, loss: 1.3776 2024-05-30 22:05:50,921 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 19:46:32, time: 0.797, data_time: 0.043, memory: 18860, loss_rpn_cls: 0.0809, loss_rpn_bbox: 0.0929, loss_cls: 0.4033, acc: 90.2405, loss_bbox: 0.3450, loss_mask: 0.4626, loss: 1.3848 2024-05-30 22:06:30,927 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 19:44:21, time: 0.800, data_time: 0.052, memory: 18871, loss_rpn_cls: 0.0807, loss_rpn_bbox: 0.0896, loss_cls: 0.3835, acc: 90.4768, loss_bbox: 0.3345, loss_mask: 0.4622, loss: 1.3505 2024-05-30 22:07:10,482 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 19:41:39, time: 0.792, data_time: 0.042, memory: 18871, loss_rpn_cls: 0.0750, loss_rpn_bbox: 0.0869, loss_cls: 0.3582, acc: 90.8389, loss_bbox: 0.3278, loss_mask: 0.4351, loss: 1.2831 2024-05-30 22:07:49,993 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 19:39:04, time: 0.790, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0707, loss_rpn_bbox: 0.0813, loss_cls: 0.3763, acc: 90.2083, loss_bbox: 0.3470, loss_mask: 0.4471, loss: 1.3223 2024-05-30 22:08:30,278 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 19:37:52, time: 0.806, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0764, loss_rpn_bbox: 0.0901, loss_cls: 0.3736, acc: 90.1101, loss_bbox: 0.3490, loss_mask: 0.4326, loss: 1.3216 2024-05-30 22:09:09,859 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 22:09:09,859 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 19:35:41, time: 0.792, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0713, loss_rpn_bbox: 0.0827, loss_cls: 0.3572, acc: 90.2605, loss_bbox: 0.3464, loss_mask: 0.4281, loss: 1.2858 2024-05-30 22:09:49,276 - mmdet - INFO - Epoch 
[1][1050/7330] lr: 1.000e-04, eta: 19:33:26, time: 0.788, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0720, loss_rpn_bbox: 0.0865, loss_cls: 0.3655, acc: 89.9031, loss_bbox: 0.3534, loss_mask: 0.4244, loss: 1.3018 2024-05-30 22:10:29,566 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 19:32:29, time: 0.806, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0640, loss_rpn_bbox: 0.0772, loss_cls: 0.3640, acc: 90.0591, loss_bbox: 0.3591, loss_mask: 0.4284, loss: 1.2927 2024-05-30 22:11:09,645 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 19:31:17, time: 0.802, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0809, loss_cls: 0.3572, acc: 90.1011, loss_bbox: 0.3595, loss_mask: 0.4203, loss: 1.2868 2024-05-30 22:11:48,969 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 19:29:13, time: 0.787, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0691, loss_rpn_bbox: 0.0848, loss_cls: 0.3590, acc: 89.9548, loss_bbox: 0.3557, loss_mask: 0.4128, loss: 1.2814 2024-05-30 22:12:28,946 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 19:28:01, time: 0.799, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0645, loss_rpn_bbox: 0.0874, loss_cls: 0.3667, acc: 89.6628, loss_bbox: 0.3695, loss_mask: 0.4097, loss: 1.2978 2024-05-30 22:13:08,644 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 19:26:33, time: 0.794, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0689, loss_rpn_bbox: 0.0855, loss_cls: 0.3417, acc: 90.2429, loss_bbox: 0.3498, loss_mask: 0.4072, loss: 1.2532 2024-05-30 22:13:47,990 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 19:24:46, time: 0.787, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0601, loss_rpn_bbox: 0.0803, loss_cls: 0.3443, acc: 90.0476, loss_bbox: 0.3536, loss_mask: 0.3963, loss: 1.2345 2024-05-30 22:14:28,514 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 19:24:16, time: 0.810, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0661, loss_rpn_bbox: 0.0829, loss_cls: 0.3424, acc: 89.9197, loss_bbox: 0.3534, loss_mask: 0.3977, loss: 1.2425 2024-05-30 22:15:08,544 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 19:23:17, time: 0.801, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0622, loss_rpn_bbox: 0.0857, loss_cls: 0.3387, acc: 90.0427, loss_bbox: 0.3618, loss_mask: 0.3969, loss: 1.2453 2024-05-30 22:15:48,402 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 19:22:09, time: 0.797, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0621, loss_rpn_bbox: 0.0830, loss_cls: 0.3338, acc: 89.9958, loss_bbox: 0.3519, loss_mask: 0.3891, loss: 1.2199 2024-05-30 22:16:28,182 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 19:20:58, time: 0.796, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0810, loss_cls: 0.3436, acc: 90.0271, loss_bbox: 0.3617, loss_mask: 0.3969, loss: 1.2409 2024-05-30 22:17:12,556 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 19:23:57, time: 0.888, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0613, loss_rpn_bbox: 0.0799, loss_cls: 0.3502, acc: 89.5706, loss_bbox: 0.3707, loss_mask: 0.3866, loss: 1.2486 2024-05-30 22:17:59,917 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 19:29:19, time: 0.947, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0575, loss_rpn_bbox: 0.0788, loss_cls: 0.3311, acc: 90.1965, loss_bbox: 0.3522, loss_mask: 0.3906, loss: 1.2101 2024-05-30 22:18:40,399 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 19:28:30, time: 0.810, data_time: 0.045, memory: 18874, 
loss_rpn_cls: 0.0667, loss_rpn_bbox: 0.0788, loss_cls: 0.3465, acc: 89.6265, loss_bbox: 0.3685, loss_mask: 0.3930, loss: 1.2535 2024-05-30 22:19:20,049 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 19:27:01, time: 0.793, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0803, loss_cls: 0.3319, acc: 89.9517, loss_bbox: 0.3586, loss_mask: 0.3802, loss: 1.2088 2024-05-30 22:19:59,521 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 19:25:26, time: 0.789, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0578, loss_rpn_bbox: 0.0757, loss_cls: 0.3270, acc: 90.0447, loss_bbox: 0.3525, loss_mask: 0.3803, loss: 1.1933 2024-05-30 22:20:39,119 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 19:24:00, time: 0.792, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0552, loss_rpn_bbox: 0.0785, loss_cls: 0.3332, acc: 90.0273, loss_bbox: 0.3473, loss_mask: 0.3796, loss: 1.1937 2024-05-30 22:21:19,048 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 19:22:51, time: 0.799, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0559, loss_rpn_bbox: 0.0766, loss_cls: 0.3260, acc: 89.9758, loss_bbox: 0.3592, loss_mask: 0.3708, loss: 1.1884 2024-05-30 22:21:59,251 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 19:21:55, time: 0.804, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0553, loss_rpn_bbox: 0.0831, loss_cls: 0.3333, acc: 90.0806, loss_bbox: 0.3474, loss_mask: 0.3687, loss: 1.1878 2024-05-30 22:22:38,925 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 22:22:38,925 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 19:20:38, time: 0.793, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0535, loss_rpn_bbox: 0.0728, loss_cls: 0.3163, acc: 90.3208, loss_bbox: 0.3427, loss_mask: 0.3679, loss: 1.1532 2024-05-30 22:23:18,977 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 19:19:39, time: 0.801, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0541, loss_rpn_bbox: 0.0741, loss_cls: 0.3266, acc: 89.8835, loss_bbox: 0.3606, loss_mask: 0.3698, loss: 1.1851 2024-05-30 22:23:58,951 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 19:18:37, time: 0.800, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0768, loss_cls: 0.3147, acc: 90.3972, loss_bbox: 0.3385, loss_mask: 0.3718, loss: 1.1580 2024-05-30 22:24:39,378 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 19:17:54, time: 0.809, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0545, loss_rpn_bbox: 0.0772, loss_cls: 0.3119, acc: 90.4358, loss_bbox: 0.3368, loss_mask: 0.3704, loss: 1.1509 2024-05-30 22:25:19,378 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 19:16:55, time: 0.800, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0564, loss_rpn_bbox: 0.0753, loss_cls: 0.3128, acc: 90.1089, loss_bbox: 0.3471, loss_mask: 0.3582, loss: 1.1498 2024-05-30 22:25:59,413 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 19:15:58, time: 0.801, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0519, loss_rpn_bbox: 0.0717, loss_cls: 0.3100, acc: 90.4143, loss_bbox: 0.3376, loss_mask: 0.3621, loss: 1.1333 2024-05-30 22:26:39,759 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 19:15:13, time: 0.807, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0498, loss_rpn_bbox: 0.0738, loss_cls: 0.3141, acc: 90.2168, loss_bbox: 0.3478, loss_mask: 0.3617, loss: 1.1471 2024-05-30 22:27:19,066 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 19:13:51, time: 0.786, data_time: 0.043, memory: 
18874, loss_rpn_cls: 0.0504, loss_rpn_bbox: 0.0729, loss_cls: 0.3041, acc: 90.6133, loss_bbox: 0.3375, loss_mask: 0.3546, loss: 1.1195 2024-05-30 22:27:59,514 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 19:13:11, time: 0.809, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0534, loss_rpn_bbox: 0.0787, loss_cls: 0.3169, acc: 89.9221, loss_bbox: 0.3638, loss_mask: 0.3613, loss: 1.1740 2024-05-30 22:28:39,854 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 19:12:27, time: 0.807, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0739, loss_cls: 0.3012, acc: 90.7358, loss_bbox: 0.3321, loss_mask: 0.3527, loss: 1.1079 2024-05-30 22:29:23,347 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 19:13:31, time: 0.870, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0495, loss_rpn_bbox: 0.0738, loss_cls: 0.3116, acc: 90.3481, loss_bbox: 0.3467, loss_mask: 0.3578, loss: 1.1395 2024-05-30 22:30:13,668 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 19:18:20, time: 1.006, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0512, loss_rpn_bbox: 0.0766, loss_cls: 0.3216, acc: 89.9648, loss_bbox: 0.3533, loss_mask: 0.3550, loss: 1.1577 2024-05-30 22:30:53,735 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 19:17:19, time: 0.801, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0472, loss_rpn_bbox: 0.0702, loss_cls: 0.3013, acc: 90.5088, loss_bbox: 0.3375, loss_mask: 0.3588, loss: 1.1150 2024-05-30 22:31:34,162 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 19:16:30, time: 0.809, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0490, loss_rpn_bbox: 0.0741, loss_cls: 0.3133, acc: 90.1348, loss_bbox: 0.3487, loss_mask: 0.3539, loss: 1.1390 2024-05-30 22:32:13,789 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 19:15:16, time: 0.792, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0476, loss_rpn_bbox: 0.0712, loss_cls: 0.3036, acc: 90.3181, loss_bbox: 0.3443, loss_mask: 0.3472, loss: 1.1139 2024-05-30 22:32:53,730 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 19:14:14, time: 0.799, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0519, loss_rpn_bbox: 0.0745, loss_cls: 0.3023, acc: 90.3965, loss_bbox: 0.3379, loss_mask: 0.3532, loss: 1.1198 2024-05-30 22:33:33,604 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 19:13:10, time: 0.797, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0490, loss_rpn_bbox: 0.0706, loss_cls: 0.2967, acc: 90.6394, loss_bbox: 0.3368, loss_mask: 0.3383, loss: 1.0913 2024-05-30 22:34:13,492 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 19:12:07, time: 0.797, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0478, loss_rpn_bbox: 0.0713, loss_cls: 0.3053, acc: 90.5415, loss_bbox: 0.3364, loss_mask: 0.3470, loss: 1.1077 2024-05-30 22:34:53,183 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 19:11:00, time: 0.794, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0507, loss_rpn_bbox: 0.0691, loss_cls: 0.2981, acc: 90.5474, loss_bbox: 0.3308, loss_mask: 0.3429, loss: 1.0916 2024-05-30 22:35:33,225 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 19:10:03, time: 0.801, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0497, loss_rpn_bbox: 0.0739, loss_cls: 0.3132, acc: 89.9712, loss_bbox: 0.3573, loss_mask: 0.3493, loss: 1.1433 2024-05-30 22:36:13,093 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 22:36:13,093 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 19:09:02, time: 0.797, data_time: 0.044, 
memory: 18874, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0701, loss_cls: 0.3099, acc: 90.2644, loss_bbox: 0.3460, loss_mask: 0.3430, loss: 1.1170 2024-05-30 22:36:53,181 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 19:08:08, time: 0.802, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0494, loss_rpn_bbox: 0.0760, loss_cls: 0.3081, acc: 90.1704, loss_bbox: 0.3483, loss_mask: 0.3402, loss: 1.1219 2024-05-30 22:37:33,278 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 19:07:15, time: 0.802, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0484, loss_rpn_bbox: 0.0711, loss_cls: 0.2974, acc: 90.4409, loss_bbox: 0.3369, loss_mask: 0.3386, loss: 1.0924 2024-05-30 22:38:13,186 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 19:06:17, time: 0.798, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0471, loss_rpn_bbox: 0.0724, loss_cls: 0.3065, acc: 90.2615, loss_bbox: 0.3474, loss_mask: 0.3411, loss: 1.1145 2024-05-30 22:38:52,401 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 19:05:01, time: 0.784, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0445, loss_rpn_bbox: 0.0695, loss_cls: 0.3080, acc: 90.4131, loss_bbox: 0.3331, loss_mask: 0.3433, loss: 1.0985 2024-05-30 22:39:32,499 - mmdet - INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 19:04:09, time: 0.802, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0485, loss_rpn_bbox: 0.0674, loss_cls: 0.3031, acc: 90.2664, loss_bbox: 0.3399, loss_mask: 0.3366, loss: 1.0955 2024-05-30 22:40:12,728 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 19:03:21, time: 0.805, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0476, loss_rpn_bbox: 0.0800, loss_cls: 0.3245, acc: 89.7598, loss_bbox: 0.3531, loss_mask: 0.3489, loss: 1.1540 2024-05-30 22:40:52,967 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 19:02:34, time: 0.805, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0728, loss_cls: 0.3083, acc: 90.3267, loss_bbox: 0.3389, loss_mask: 0.3427, loss: 1.1119 2024-05-30 22:41:34,599 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 19:02:21, time: 0.833, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0493, loss_rpn_bbox: 0.0769, loss_cls: 0.3064, acc: 90.2478, loss_bbox: 0.3483, loss_mask: 0.3398, loss: 1.1207 2024-05-30 22:42:25,550 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 19:05:56, time: 1.019, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0471, loss_rpn_bbox: 0.0729, loss_cls: 0.2991, acc: 90.2688, loss_bbox: 0.3451, loss_mask: 0.3402, loss: 1.1043 2024-05-30 22:43:05,835 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 19:05:06, time: 0.806, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0416, loss_rpn_bbox: 0.0675, loss_cls: 0.2842, acc: 90.7014, loss_bbox: 0.3310, loss_mask: 0.3354, loss: 1.0597 2024-05-30 22:43:45,467 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 19:04:00, time: 0.793, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0665, loss_cls: 0.2983, acc: 90.6404, loss_bbox: 0.3320, loss_mask: 0.3326, loss: 1.0728 2024-05-30 22:44:24,890 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 19:02:50, time: 0.788, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0649, loss_cls: 0.2926, acc: 90.5959, loss_bbox: 0.3343, loss_mask: 0.3322, loss: 1.0691 2024-05-30 22:45:04,707 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 19:01:51, time: 0.796, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0692, loss_cls: 0.2818, acc: 90.8154, 
loss_bbox: 0.3290, loss_mask: 0.3297, loss: 1.0540 2024-05-30 22:45:44,762 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 19:00:57, time: 0.801, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0681, loss_cls: 0.2823, acc: 90.7776, loss_bbox: 0.3285, loss_mask: 0.3282, loss: 1.0495 2024-05-30 22:46:25,456 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 19:00:18, time: 0.814, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0716, loss_cls: 0.3001, acc: 90.3584, loss_bbox: 0.3382, loss_mask: 0.3323, loss: 1.0883 2024-05-30 22:47:05,199 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 18:59:18, time: 0.795, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0479, loss_rpn_bbox: 0.0674, loss_cls: 0.2952, acc: 90.4873, loss_bbox: 0.3369, loss_mask: 0.3344, loss: 1.0819 2024-05-30 22:47:44,924 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 18:58:18, time: 0.795, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0474, loss_rpn_bbox: 0.0709, loss_cls: 0.2928, acc: 90.6240, loss_bbox: 0.3331, loss_mask: 0.3295, loss: 1.0737 2024-05-30 22:48:24,853 - mmdet - INFO - Epoch [1][3900/7330] lr: 1.000e-04, eta: 18:57:23, time: 0.798, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0454, loss_rpn_bbox: 0.0702, loss_cls: 0.2868, acc: 90.8955, loss_bbox: 0.3227, loss_mask: 0.3345, loss: 1.0595 2024-05-30 22:49:05,578 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 18:56:45, time: 0.815, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0702, loss_cls: 0.2903, acc: 90.4937, loss_bbox: 0.3340, loss_mask: 0.3277, loss: 1.0673 2024-05-30 22:49:46,018 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 22:49:46,018 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 18:56:01, time: 0.809, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0436, loss_rpn_bbox: 0.0663, loss_cls: 0.2950, acc: 90.3857, loss_bbox: 0.3426, loss_mask: 0.3358, loss: 1.0835 2024-05-30 22:50:26,562 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 18:55:20, time: 0.811, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0679, loss_cls: 0.2851, acc: 90.7256, loss_bbox: 0.3282, loss_mask: 0.3285, loss: 1.0506 2024-05-30 22:51:07,914 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 18:54:54, time: 0.827, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0471, loss_rpn_bbox: 0.0731, loss_cls: 0.2836, acc: 90.7390, loss_bbox: 0.3304, loss_mask: 0.3287, loss: 1.0629 2024-05-30 22:51:47,199 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 18:53:47, time: 0.786, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0675, loss_cls: 0.2856, acc: 90.6650, loss_bbox: 0.3318, loss_mask: 0.3255, loss: 1.0509 2024-05-30 22:52:27,110 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 18:52:53, time: 0.798, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0408, loss_rpn_bbox: 0.0640, loss_cls: 0.2806, acc: 90.6716, loss_bbox: 0.3283, loss_mask: 0.3182, loss: 1.0319 2024-05-30 22:53:06,949 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 18:51:58, time: 0.797, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0399, loss_rpn_bbox: 0.0681, loss_cls: 0.2928, acc: 90.3936, loss_bbox: 0.3369, loss_mask: 0.3320, loss: 1.0697 2024-05-30 22:53:46,934 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 18:51:06, time: 0.800, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0412, loss_rpn_bbox: 0.0671, loss_cls: 0.2758, acc: 
91.0027, loss_bbox: 0.3202, loss_mask: 0.3224, loss: 1.0267 2024-05-30 22:54:39,632 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 18:54:19, time: 1.054, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0699, loss_cls: 0.2937, acc: 90.4353, loss_bbox: 0.3323, loss_mask: 0.3311, loss: 1.0726 2024-05-30 22:55:19,497 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 18:53:22, time: 0.797, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0666, loss_cls: 0.2878, acc: 90.5046, loss_bbox: 0.3328, loss_mask: 0.3181, loss: 1.0463 2024-05-30 22:55:59,341 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 18:52:26, time: 0.797, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0704, loss_cls: 0.2831, acc: 90.5007, loss_bbox: 0.3325, loss_mask: 0.3258, loss: 1.0530 2024-05-30 22:56:39,132 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 18:51:29, time: 0.796, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0426, loss_rpn_bbox: 0.0695, loss_cls: 0.2881, acc: 90.5825, loss_bbox: 0.3288, loss_mask: 0.3273, loss: 1.0564 2024-05-30 22:57:18,983 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 18:50:33, time: 0.797, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0682, loss_cls: 0.2984, acc: 90.1277, loss_bbox: 0.3406, loss_mask: 0.3273, loss: 1.0788 2024-05-30 22:57:58,505 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 18:49:32, time: 0.790, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0428, loss_rpn_bbox: 0.0647, loss_cls: 0.2864, acc: 90.6826, loss_bbox: 0.3329, loss_mask: 0.3258, loss: 1.0526 2024-05-30 22:58:38,558 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 18:48:40, time: 0.801, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0399, loss_rpn_bbox: 0.0664, loss_cls: 0.2780, acc: 90.7793, loss_bbox: 0.3261, loss_mask: 0.3153, loss: 1.0257 2024-05-30 22:59:18,126 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 18:47:40, time: 0.791, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0646, loss_cls: 0.2763, acc: 90.8308, loss_bbox: 0.3298, loss_mask: 0.3159, loss: 1.0266 2024-05-30 22:59:58,327 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 18:46:52, time: 0.804, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0449, loss_rpn_bbox: 0.0692, loss_cls: 0.2851, acc: 90.6235, loss_bbox: 0.3320, loss_mask: 0.3289, loss: 1.0601 2024-05-30 23:00:38,138 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 18:45:58, time: 0.796, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0625, loss_cls: 0.2797, acc: 90.7732, loss_bbox: 0.3294, loss_mask: 0.3217, loss: 1.0328 2024-05-30 23:01:18,428 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 18:45:11, time: 0.806, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0654, loss_cls: 0.2749, acc: 91.1240, loss_bbox: 0.3165, loss_mask: 0.3155, loss: 1.0120 2024-05-30 23:01:58,158 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 18:44:16, time: 0.795, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0419, loss_rpn_bbox: 0.0679, loss_cls: 0.2831, acc: 90.7568, loss_bbox: 0.3280, loss_mask: 0.3257, loss: 1.0466 2024-05-30 23:02:38,173 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 18:43:25, time: 0.800, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0403, loss_rpn_bbox: 0.0661, loss_cls: 0.2802, acc: 90.9150, loss_bbox: 0.3268, loss_mask: 0.3114, loss: 1.0248 2024-05-30 23:03:17,957 - mmdet - 
INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 23:03:17,957 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 18:42:31, time: 0.796, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0402, loss_rpn_bbox: 0.0680, loss_cls: 0.2749, acc: 90.9536, loss_bbox: 0.3212, loss_mask: 0.3279, loss: 1.0323 2024-05-30 23:03:57,876 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 18:41:39, time: 0.798, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0665, loss_cls: 0.2665, acc: 91.2063, loss_bbox: 0.3166, loss_mask: 0.3152, loss: 1.0029 2024-05-30 23:04:38,157 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 18:40:54, time: 0.806, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0411, loss_rpn_bbox: 0.0662, loss_cls: 0.2786, acc: 90.7463, loss_bbox: 0.3246, loss_mask: 0.3177, loss: 1.0283 2024-05-30 23:05:17,714 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 18:39:57, time: 0.791, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0603, loss_cls: 0.2694, acc: 91.1704, loss_bbox: 0.3134, loss_mask: 0.3106, loss: 0.9918 2024-05-30 23:05:57,233 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 18:38:59, time: 0.790, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0618, loss_cls: 0.2755, acc: 90.9670, loss_bbox: 0.3245, loss_mask: 0.3074, loss: 1.0064 2024-05-30 23:06:50,419 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 18:41:38, time: 1.064, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0411, loss_rpn_bbox: 0.0647, loss_cls: 0.2871, acc: 90.5920, loss_bbox: 0.3308, loss_mask: 0.3133, loss: 1.0369 2024-05-30 23:07:30,452 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 18:40:47, time: 0.801, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0633, loss_cls: 0.2767, acc: 90.9629, loss_bbox: 0.3216, loss_mask: 0.3140, loss: 1.0143 2024-05-30 23:08:09,767 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 18:39:45, time: 0.786, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0624, loss_cls: 0.2731, acc: 90.8354, loss_bbox: 0.3249, loss_mask: 0.3147, loss: 1.0142 2024-05-30 23:08:49,578 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 18:38:51, time: 0.796, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0427, loss_rpn_bbox: 0.0671, loss_cls: 0.2696, acc: 90.9719, loss_bbox: 0.3227, loss_mask: 0.3123, loss: 1.0144 2024-05-30 23:09:29,841 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 18:38:05, time: 0.805, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0407, loss_rpn_bbox: 0.0620, loss_cls: 0.2796, acc: 90.7756, loss_bbox: 0.3286, loss_mask: 0.3123, loss: 1.0233 2024-05-30 23:10:09,292 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 18:37:06, time: 0.789, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0636, loss_cls: 0.2716, acc: 91.1257, loss_bbox: 0.3177, loss_mask: 0.3036, loss: 0.9918 2024-05-30 23:10:49,434 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 18:36:18, time: 0.803, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0626, loss_cls: 0.2775, acc: 90.8464, loss_bbox: 0.3197, loss_mask: 0.3146, loss: 1.0144 2024-05-30 23:11:29,946 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 18:35:35, time: 0.810, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0416, loss_rpn_bbox: 0.0645, loss_cls: 0.2791, acc: 90.6533, loss_bbox: 0.3295, loss_mask: 0.3111, loss: 1.0257 2024-05-30 23:12:10,505 - 
mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 18:34:54, time: 0.811, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0677, loss_cls: 0.2650, acc: 91.3518, loss_bbox: 0.3109, loss_mask: 0.3087, loss: 0.9900 2024-05-30 23:12:50,498 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 18:34:04, time: 0.800, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0648, loss_cls: 0.2718, acc: 90.9866, loss_bbox: 0.3142, loss_mask: 0.3109, loss: 1.0011 2024-05-30 23:13:30,462 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 18:33:14, time: 0.799, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0637, loss_cls: 0.2719, acc: 90.9802, loss_bbox: 0.3201, loss_mask: 0.3149, loss: 1.0078 2024-05-30 23:14:09,589 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 18:32:12, time: 0.783, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0616, loss_cls: 0.2624, acc: 91.2798, loss_bbox: 0.3106, loss_mask: 0.3060, loss: 0.9789 2024-05-30 23:14:49,707 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 18:31:24, time: 0.802, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0445, loss_rpn_bbox: 0.0685, loss_cls: 0.2748, acc: 90.8140, loss_bbox: 0.3210, loss_mask: 0.3101, loss: 1.0189 2024-05-30 23:15:29,436 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 18:30:31, time: 0.795, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0667, loss_cls: 0.2717, acc: 91.0796, loss_bbox: 0.3188, loss_mask: 0.3186, loss: 1.0175 2024-05-30 23:16:09,190 - mmdet - INFO - Epoch [1][5950/7330] lr: 1.000e-04, eta: 18:29:39, time: 0.795, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0634, loss_cls: 0.2537, acc: 91.4492, loss_bbox: 0.3051, loss_mask: 0.3091, loss: 0.9690 2024-05-30 23:16:48,845 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-30 23:16:48,845 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 18:28:46, time: 0.793, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0695, loss_cls: 0.2668, acc: 91.2283, loss_bbox: 0.3139, loss_mask: 0.3142, loss: 1.0067 2024-05-30 23:17:28,857 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 18:27:57, time: 0.800, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0632, loss_cls: 0.2667, acc: 91.0571, loss_bbox: 0.3189, loss_mask: 0.3026, loss: 0.9891 2024-05-30 23:18:08,456 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 18:27:04, time: 0.792, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0655, loss_cls: 0.2741, acc: 90.7478, loss_bbox: 0.3260, loss_mask: 0.3063, loss: 1.0086 2024-05-30 23:18:56,623 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 18:28:04, time: 0.963, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0360, loss_rpn_bbox: 0.0630, loss_cls: 0.2714, acc: 90.9951, loss_bbox: 0.3220, loss_mask: 0.3044, loss: 0.9967 2024-05-30 23:19:40,582 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 18:28:07, time: 0.879, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0594, loss_cls: 0.2622, acc: 91.1858, loss_bbox: 0.3108, loss_mask: 0.3017, loss: 0.9687 2024-05-30 23:20:20,359 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 18:27:15, time: 0.796, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0569, loss_cls: 0.2614, acc: 91.4277, loss_bbox: 0.3022, loss_mask: 0.2974, loss: 0.9549 2024-05-30 
23:21:00,032 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 18:26:22, time: 0.793, data_time: 0.034, memory: 18874, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0589, loss_cls: 0.2526, acc: 91.7502, loss_bbox: 0.2972, loss_mask: 0.3105, loss: 0.9537 2024-05-30 23:21:40,204 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 18:25:35, time: 0.804, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0628, loss_cls: 0.2648, acc: 91.0581, loss_bbox: 0.3182, loss_mask: 0.3088, loss: 0.9925 2024-05-30 23:22:20,272 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 18:24:47, time: 0.801, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0593, loss_cls: 0.2475, acc: 91.7498, loss_bbox: 0.2950, loss_mask: 0.3005, loss: 0.9390 2024-05-30 23:23:00,014 - mmdet - INFO - Epoch [1][6450/7330] lr: 1.000e-04, eta: 18:23:55, time: 0.795, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0598, loss_cls: 0.2619, acc: 91.1238, loss_bbox: 0.3106, loss_mask: 0.2990, loss: 0.9679 2024-05-30 23:23:39,833 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 18:23:04, time: 0.796, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0385, loss_rpn_bbox: 0.0628, loss_cls: 0.2741, acc: 90.9106, loss_bbox: 0.3230, loss_mask: 0.3084, loss: 1.0068 2024-05-30 23:24:20,220 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 18:22:21, time: 0.808, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0634, loss_cls: 0.2787, acc: 90.7844, loss_bbox: 0.3221, loss_mask: 0.3171, loss: 1.0211 2024-05-30 23:25:00,590 - mmdet - INFO - Epoch [1][6600/7330] lr: 1.000e-04, eta: 18:21:37, time: 0.807, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0608, loss_cls: 0.2609, acc: 91.2329, loss_bbox: 0.3039, loss_mask: 0.3062, loss: 0.9654 2024-05-30 23:25:40,415 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 18:20:47, time: 0.797, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0403, loss_rpn_bbox: 0.0637, loss_cls: 0.2539, acc: 91.4390, loss_bbox: 0.3067, loss_mask: 0.3062, loss: 0.9708 2024-05-30 23:26:20,603 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 18:20:01, time: 0.804, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0622, loss_cls: 0.2645, acc: 91.1855, loss_bbox: 0.3147, loss_mask: 0.3049, loss: 0.9844 2024-05-30 23:27:00,578 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 18:19:13, time: 0.799, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0603, loss_cls: 0.2563, acc: 91.3625, loss_bbox: 0.3039, loss_mask: 0.3104, loss: 0.9690 2024-05-30 23:27:40,141 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 18:18:20, time: 0.791, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0622, loss_cls: 0.2703, acc: 91.0649, loss_bbox: 0.3156, loss_mask: 0.3045, loss: 0.9884 2024-05-30 23:28:19,797 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 18:17:28, time: 0.793, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0618, loss_cls: 0.2629, acc: 91.1289, loss_bbox: 0.3123, loss_mask: 0.3034, loss: 0.9772 2024-05-30 23:28:59,612 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 18:16:38, time: 0.796, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0604, loss_cls: 0.2636, acc: 91.0752, loss_bbox: 0.3084, loss_mask: 0.3036, loss: 0.9736 2024-05-30 23:29:39,485 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 18:15:49, time: 0.797, 
data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0619, loss_cls: 0.2524, acc: 91.7268, loss_bbox: 0.2984, loss_mask: 0.2944, loss: 0.9459
2024-05-30 23:30:19,397 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-30 23:30:19,398 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 18:15:01, time: 0.798, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0597, loss_cls: 0.2581, acc: 91.2109, loss_bbox: 0.3126, loss_mask: 0.3020, loss: 0.9693
2024-05-30 23:31:05,419 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 18:15:23, time: 0.920, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0635, loss_cls: 0.2657, acc: 90.9973, loss_bbox: 0.3178, loss_mask: 0.3063, loss: 0.9939
2024-05-30 23:31:52,125 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 18:15:52, time: 0.934, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0619, loss_cls: 0.2587, acc: 91.4229, loss_bbox: 0.3019, loss_mask: 0.2970, loss: 0.9567
2024-05-30 23:32:31,690 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 18:14:59, time: 0.791, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0603, loss_cls: 0.2573, acc: 91.4028, loss_bbox: 0.2961, loss_mask: 0.2987, loss: 0.9483
2024-05-30 23:33:11,841 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 18:14:12, time: 0.803, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0623, loss_cls: 0.2754, acc: 90.7830, loss_bbox: 0.3241, loss_mask: 0.2961, loss: 0.9947
2024-05-30 23:33:52,149 - mmdet - INFO - Epoch [1][7250/7330] lr: 1.000e-04, eta: 18:13:28, time: 0.806, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0644, loss_cls: 0.2630, acc: 91.0957, loss_bbox: 0.3116, loss_mask: 0.3065, loss: 0.9869
2024-05-30 23:34:32,326 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 18:12:42, time: 0.804, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0648, loss_cls: 0.2722, acc: 90.8765, loss_bbox: 0.3178, loss_mask: 0.2993, loss: 0.9920
2024-05-30 23:34:57,274 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-30 23:36:45,482 - mmdet - INFO - Evaluating bbox...
2024-05-30 23:37:14,042 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.319
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.578
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.319
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.185
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.354
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.445
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.447
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.447
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.447
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.264
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.491
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.602
2024-05-30 23:37:14,042 - mmdet - INFO - Evaluating segm...
2024-05-30 23:37:46,788 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.315
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.544
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.320
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.122
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.340
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.518
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.426
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.426
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.426
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.210
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.472
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.623
2024-05-30 23:37:47,199 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-30 23:37:47,199 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.3190, bbox_mAP_50: 0.5780, bbox_mAP_75: 0.3190, bbox_mAP_s: 0.1850, bbox_mAP_m: 0.3540, bbox_mAP_l: 0.4450, bbox_mAP_copypaste: 0.319 0.578 0.319 0.185 0.354 0.445, segm_mAP: 0.3150, segm_mAP_50: 0.5440, segm_mAP_75: 0.3200, segm_mAP_s: 0.1220, segm_mAP_m: 0.3400, segm_mAP_l: 0.5180, segm_mAP_copypaste: 0.315 0.544 0.320 0.122 0.340 0.518
2024-05-30 23:38:33,954 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 18:08:17, time: 0.935, data_time: 0.121, memory: 18874, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0613, loss_cls: 0.2574, acc: 91.3013, loss_bbox: 0.3009, loss_mask: 0.3019, loss: 0.9576
2024-05-30 23:39:15,373 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 18:07:47, time: 0.828, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0384, loss_rpn_bbox: 0.0656, loss_cls: 0.2713, acc: 90.7419, loss_bbox: 0.3240, loss_mask: 0.3052, loss: 1.0047
2024-05-30 23:39:56,491 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 18:07:12, time: 0.822, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0623, loss_cls: 0.2588, acc: 91.1609, loss_bbox: 0.3106, loss_mask: 0.3019, loss: 0.9682
2024-05-30 23:40:36,440 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 18:06:26, time: 0.799, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0584, loss_cls: 0.2465, acc: 91.3813, loss_bbox: 0.3048, loss_mask: 0.2905, loss: 0.9317
2024-05-30 23:41:16,059 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 18:05:36, time: 0.792, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0572, loss_cls: 0.2494, acc: 91.6394, loss_bbox: 0.2977, loss_mask: 0.2925, loss: 0.9292
2024-05-30 23:41:56,562 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 18:04:55, time: 0.810, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0637, loss_cls: 0.2591, acc: 91.1550, loss_bbox: 0.3163, loss_mask: 0.2988, loss: 0.9737
2024-05-30 23:42:37,028 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 18:04:14, time: 0.809, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0596, loss_cls: 0.2438, acc: 91.5896, loss_bbox: 0.2949, loss_mask: 0.2962, loss: 0.9290
2024-05-30 23:43:18,088 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 18:03:39, time: 0.821, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0600, loss_cls: 0.2493, acc: 91.4292, loss_bbox: 0.2991,
loss_mask: 0.2973, loss: 0.9392 2024-05-30 23:43:58,440 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 18:02:57, time: 0.807, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0587, loss_cls: 0.2332, acc: 92.1587, loss_bbox: 0.2874, loss_mask: 0.2921, loss: 0.9074 2024-05-30 23:44:38,621 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 18:02:13, time: 0.804, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0612, loss_cls: 0.2564, acc: 91.2180, loss_bbox: 0.3076, loss_mask: 0.3007, loss: 0.9621 2024-05-30 23:45:18,369 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 18:01:24, time: 0.795, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0541, loss_cls: 0.2335, acc: 92.1172, loss_bbox: 0.2803, loss_mask: 0.2861, loss: 0.8852 2024-05-30 23:45:59,188 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 18:00:47, time: 0.816, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0607, loss_cls: 0.2551, acc: 91.3164, loss_bbox: 0.2997, loss_mask: 0.2908, loss: 0.9414 2024-05-30 23:46:39,411 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 18:00:03, time: 0.804, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0577, loss_cls: 0.2455, acc: 91.6697, loss_bbox: 0.2946, loss_mask: 0.2926, loss: 0.9235 2024-05-30 23:47:19,320 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 17:59:17, time: 0.798, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0570, loss_cls: 0.2455, acc: 91.7263, loss_bbox: 0.2971, loss_mask: 0.2902, loss: 0.9244 2024-05-30 23:48:00,141 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 17:58:39, time: 0.816, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0604, loss_cls: 0.2523, acc: 91.4763, loss_bbox: 0.3007, loss_mask: 0.2968, loss: 0.9453 2024-05-30 23:48:40,303 - mmdet - INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 17:57:56, time: 0.803, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0608, loss_cls: 0.2522, acc: 91.5691, loss_bbox: 0.2992, loss_mask: 0.2907, loss: 0.9385 2024-05-30 23:49:20,715 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 17:57:14, time: 0.808, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0598, loss_cls: 0.2492, acc: 91.5386, loss_bbox: 0.3014, loss_mask: 0.2925, loss: 0.9375 2024-05-30 23:50:01,589 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 17:56:37, time: 0.818, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0367, loss_rpn_bbox: 0.0610, loss_cls: 0.2661, acc: 90.8306, loss_bbox: 0.3222, loss_mask: 0.3063, loss: 0.9923 2024-05-30 23:50:41,936 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 17:55:55, time: 0.807, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0584, loss_cls: 0.2494, acc: 91.5159, loss_bbox: 0.2969, loss_mask: 0.2934, loss: 0.9318 2024-05-30 23:51:21,904 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 17:55:09, time: 0.799, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0596, loss_cls: 0.2467, acc: 91.5801, loss_bbox: 0.2982, loss_mask: 0.2947, loss: 0.9322 2024-05-30 23:52:05,014 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 17:54:54, time: 0.862, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0579, loss_cls: 0.2460, acc: 91.6621, loss_bbox: 0.2942, loss_mask: 0.2878, loss: 0.9202 2024-05-30 23:52:59,672 - mmdet - INFO - Epoch [2][1100/7330] lr: 
1.000e-04, eta: 17:56:26, time: 1.093, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0598, loss_cls: 0.2474, acc: 91.3997, loss_bbox: 0.3072, loss_mask: 0.2935, loss: 0.9412 2024-05-30 23:53:39,975 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 17:55:43, time: 0.806, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0564, loss_cls: 0.2340, acc: 91.9817, loss_bbox: 0.2815, loss_mask: 0.2880, loss: 0.8910 2024-05-30 23:54:20,296 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 17:55:00, time: 0.806, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0617, loss_cls: 0.2507, acc: 91.4734, loss_bbox: 0.3030, loss_mask: 0.2895, loss: 0.9422 2024-05-30 23:55:00,293 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 17:54:14, time: 0.800, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0578, loss_cls: 0.2330, acc: 92.0391, loss_bbox: 0.2864, loss_mask: 0.2774, loss: 0.8893 2024-05-30 23:55:40,707 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 17:53:31, time: 0.808, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0586, loss_cls: 0.2407, acc: 91.7847, loss_bbox: 0.2906, loss_mask: 0.2923, loss: 0.9133 2024-05-30 23:56:22,157 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 17:52:58, time: 0.829, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0591, loss_cls: 0.2557, acc: 91.2800, loss_bbox: 0.3059, loss_mask: 0.2929, loss: 0.9471 2024-05-30 23:57:02,543 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 17:52:16, time: 0.808, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0626, loss_cls: 0.2624, acc: 91.1001, loss_bbox: 0.3078, loss_mask: 0.3001, loss: 0.9658 2024-05-30 23:57:43,730 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 17:51:41, time: 0.824, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0601, loss_cls: 0.2489, acc: 91.4534, loss_bbox: 0.3038, loss_mask: 0.2948, loss: 0.9411 2024-05-30 23:58:29,001 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 17:51:42, time: 0.905, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0628, loss_cls: 0.2561, acc: 91.4419, loss_bbox: 0.3056, loss_mask: 0.2903, loss: 0.9513 2024-05-30 23:59:09,042 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 17:50:56, time: 0.801, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0583, loss_cls: 0.2495, acc: 91.4338, loss_bbox: 0.3025, loss_mask: 0.2901, loss: 0.9328 2024-05-30 23:59:49,269 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 17:50:12, time: 0.805, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0587, loss_cls: 0.2457, acc: 91.5774, loss_bbox: 0.2968, loss_mask: 0.2877, loss: 0.9227 2024-05-31 00:00:29,634 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 17:49:29, time: 0.807, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0608, loss_cls: 0.2515, acc: 91.5459, loss_bbox: 0.3032, loss_mask: 0.2873, loss: 0.9365 2024-05-31 00:01:10,334 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 17:48:49, time: 0.814, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0614, loss_cls: 0.2590, acc: 91.2473, loss_bbox: 0.3094, loss_mask: 0.2941, loss: 0.9573 2024-05-31 00:01:50,293 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 17:48:02, time: 0.799, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0335, 
loss_rpn_bbox: 0.0590, loss_cls: 0.2357, acc: 92.1233, loss_bbox: 0.2834, loss_mask: 0.2865, loss: 0.8982 2024-05-31 00:02:30,964 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 17:47:22, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0602, loss_cls: 0.2452, acc: 91.5190, loss_bbox: 0.2982, loss_mask: 0.2869, loss: 0.9234 2024-05-31 00:03:12,008 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 17:46:45, time: 0.821, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0590, loss_cls: 0.2411, acc: 91.5894, loss_bbox: 0.2978, loss_mask: 0.2843, loss: 0.9142 2024-05-31 00:03:52,939 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 17:46:07, time: 0.819, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0623, loss_cls: 0.2508, acc: 91.4429, loss_bbox: 0.3032, loss_mask: 0.2947, loss: 0.9476 2024-05-31 00:04:33,026 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 17:45:22, time: 0.802, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0563, loss_cls: 0.2470, acc: 91.5918, loss_bbox: 0.2998, loss_mask: 0.2916, loss: 0.9271 2024-05-31 00:05:13,113 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 17:44:37, time: 0.802, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0625, loss_cls: 0.2382, acc: 91.9353, loss_bbox: 0.2910, loss_mask: 0.2871, loss: 0.9131 2024-05-31 00:05:54,105 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 17:43:59, time: 0.820, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0340, loss_rpn_bbox: 0.0610, loss_cls: 0.2548, acc: 91.2783, loss_bbox: 0.3070, loss_mask: 0.2949, loss: 0.9517 2024-05-31 00:06:34,398 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 17:43:16, time: 0.806, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0608, loss_cls: 0.2525, acc: 91.3433, loss_bbox: 0.3085, loss_mask: 0.2931, loss: 0.9485 2024-05-31 00:07:28,383 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 17:44:26, time: 1.080, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0580, loss_cls: 0.2595, acc: 91.1919, loss_bbox: 0.3081, loss_mask: 0.2948, loss: 0.9526 2024-05-31 00:08:12,129 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 17:44:11, time: 0.875, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0609, loss_cls: 0.2546, acc: 91.2891, loss_bbox: 0.3026, loss_mask: 0.2941, loss: 0.9467 2024-05-31 00:08:52,601 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 17:43:28, time: 0.809, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0586, loss_cls: 0.2447, acc: 91.7605, loss_bbox: 0.2901, loss_mask: 0.2870, loss: 0.9151 2024-05-31 00:09:33,033 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 17:42:45, time: 0.809, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0587, loss_cls: 0.2519, acc: 91.4089, loss_bbox: 0.3003, loss_mask: 0.2831, loss: 0.9243 2024-05-31 00:10:12,710 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 17:41:56, time: 0.794, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0553, loss_cls: 0.2486, acc: 91.4888, loss_bbox: 0.2927, loss_mask: 0.2819, loss: 0.9092 2024-05-31 00:10:53,181 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 17:41:13, time: 0.809, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0583, loss_cls: 0.2492, acc: 91.2878, loss_bbox: 0.2991, loss_mask: 0.2838, 
loss: 0.9205 2024-05-31 00:11:34,011 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 17:40:34, time: 0.817, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0576, loss_cls: 0.2447, acc: 91.6792, loss_bbox: 0.2939, loss_mask: 0.2878, loss: 0.9165 2024-05-31 00:12:13,937 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 17:39:47, time: 0.798, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0569, loss_cls: 0.2375, acc: 91.9219, loss_bbox: 0.2833, loss_mask: 0.2853, loss: 0.8963 2024-05-31 00:12:58,876 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 17:39:40, time: 0.899, data_time: 0.034, memory: 18874, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0564, loss_cls: 0.2371, acc: 91.8989, loss_bbox: 0.2847, loss_mask: 0.2902, loss: 0.9008 2024-05-31 00:13:39,366 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 17:38:57, time: 0.810, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0590, loss_cls: 0.2537, acc: 91.3191, loss_bbox: 0.3065, loss_mask: 0.2859, loss: 0.9360 2024-05-31 00:14:20,337 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 17:38:19, time: 0.819, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0635, loss_cls: 0.2610, acc: 91.0613, loss_bbox: 0.3163, loss_mask: 0.2926, loss: 0.9693 2024-05-31 00:15:01,718 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 17:37:43, time: 0.828, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0609, loss_cls: 0.2461, acc: 91.6404, loss_bbox: 0.2951, loss_mask: 0.2792, loss: 0.9144 2024-05-31 00:15:42,655 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 17:37:04, time: 0.819, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0603, loss_cls: 0.2436, acc: 91.7900, loss_bbox: 0.2888, loss_mask: 0.2875, loss: 0.9127 2024-05-31 00:16:23,507 - mmdet - INFO - Epoch [2][2800/7330] lr: 1.000e-04, eta: 17:36:24, time: 0.817, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0598, loss_cls: 0.2461, acc: 91.6489, loss_bbox: 0.2942, loss_mask: 0.2875, loss: 0.9186 2024-05-31 00:17:04,263 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 17:35:44, time: 0.815, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0577, loss_cls: 0.2366, acc: 91.7910, loss_bbox: 0.2889, loss_mask: 0.2885, loss: 0.9048 2024-05-31 00:17:44,578 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 17:35:00, time: 0.806, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0576, loss_cls: 0.2443, acc: 91.6506, loss_bbox: 0.2885, loss_mask: 0.2901, loss: 0.9143 2024-05-31 00:18:25,141 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 17:34:18, time: 0.811, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0562, loss_cls: 0.2383, acc: 91.8362, loss_bbox: 0.2958, loss_mask: 0.2896, loss: 0.9107 2024-05-31 00:19:04,723 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 17:33:29, time: 0.792, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0531, loss_cls: 0.2382, acc: 91.8882, loss_bbox: 0.2846, loss_mask: 0.2898, loss: 0.8976 2024-05-31 00:19:45,237 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 17:32:47, time: 0.810, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0592, loss_cls: 0.2437, acc: 91.7180, loss_bbox: 0.2887, loss_mask: 0.2870, loss: 0.9124 2024-05-31 00:20:26,658 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 
17:32:11, time: 0.828, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0575, loss_cls: 0.2404, acc: 91.7678, loss_bbox: 0.2878, loss_mask: 0.2867, loss: 0.9035 2024-05-31 00:21:06,927 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 17:31:27, time: 0.805, data_time: 0.037, memory: 18874, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0583, loss_cls: 0.2359, acc: 91.7512, loss_bbox: 0.2884, loss_mask: 0.2804, loss: 0.8938 2024-05-31 00:21:52,281 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 17:31:21, time: 0.907, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0566, loss_cls: 0.2336, acc: 92.0481, loss_bbox: 0.2848, loss_mask: 0.2811, loss: 0.8872 2024-05-31 00:22:43,797 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 17:31:59, time: 1.030, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0569, loss_cls: 0.2403, acc: 91.7185, loss_bbox: 0.2862, loss_mask: 0.2817, loss: 0.8970 2024-05-31 00:23:24,186 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 17:31:15, time: 0.808, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0580, loss_cls: 0.2434, acc: 91.7131, loss_bbox: 0.2959, loss_mask: 0.2886, loss: 0.9185 2024-05-31 00:24:04,671 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 17:30:32, time: 0.810, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0580, loss_cls: 0.2486, acc: 91.4312, loss_bbox: 0.2954, loss_mask: 0.2937, loss: 0.9301 2024-05-31 00:24:44,660 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 17:29:46, time: 0.800, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0557, loss_cls: 0.2382, acc: 91.9553, loss_bbox: 0.2843, loss_mask: 0.2797, loss: 0.8862 2024-05-31 00:25:25,034 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 17:29:02, time: 0.808, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0564, loss_cls: 0.2396, acc: 91.6296, loss_bbox: 0.2936, loss_mask: 0.2871, loss: 0.9099 2024-05-31 00:26:05,198 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 17:28:17, time: 0.803, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0540, loss_cls: 0.2442, acc: 91.6987, loss_bbox: 0.2890, loss_mask: 0.2839, loss: 0.9013 2024-05-31 00:26:45,826 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 17:27:35, time: 0.813, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0553, loss_cls: 0.2464, acc: 91.5352, loss_bbox: 0.2998, loss_mask: 0.2814, loss: 0.9123 2024-05-31 00:27:26,270 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 17:26:52, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0583, loss_cls: 0.2450, acc: 91.6033, loss_bbox: 0.2983, loss_mask: 0.2870, loss: 0.9212 2024-05-31 00:28:11,529 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 17:26:43, time: 0.905, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0568, loss_cls: 0.2391, acc: 91.7581, loss_bbox: 0.2937, loss_mask: 0.2794, loss: 0.9008 2024-05-31 00:28:52,250 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 17:26:01, time: 0.814, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0540, loss_cls: 0.2471, acc: 91.5852, loss_bbox: 0.2960, loss_mask: 0.2812, loss: 0.9086 2024-05-31 00:29:32,699 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 17:25:18, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0317, loss_rpn_bbox: 
0.0586, loss_cls: 0.2464, acc: 91.6030, loss_bbox: 0.2964, loss_mask: 0.2921, loss: 0.9252 2024-05-31 00:30:13,069 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 17:24:35, time: 0.807, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0567, loss_cls: 0.2372, acc: 91.8591, loss_bbox: 0.2899, loss_mask: 0.2751, loss: 0.8889 2024-05-31 00:30:53,624 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 17:23:52, time: 0.811, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0573, loss_cls: 0.2403, acc: 91.7644, loss_bbox: 0.2941, loss_mask: 0.2843, loss: 0.9085 2024-05-31 00:31:33,930 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 17:23:08, time: 0.806, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0550, loss_cls: 0.2432, acc: 91.5967, loss_bbox: 0.2910, loss_mask: 0.2814, loss: 0.9015 2024-05-31 00:32:14,823 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 17:22:28, time: 0.818, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0582, loss_cls: 0.2398, acc: 91.6365, loss_bbox: 0.2992, loss_mask: 0.2847, loss: 0.9156 2024-05-31 00:32:55,212 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 17:21:45, time: 0.808, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0559, loss_cls: 0.2458, acc: 91.5569, loss_bbox: 0.2986, loss_mask: 0.2842, loss: 0.9158 2024-05-31 00:33:35,520 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 17:21:01, time: 0.806, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0565, loss_cls: 0.2408, acc: 91.6567, loss_bbox: 0.2942, loss_mask: 0.2847, loss: 0.9081 2024-05-31 00:34:16,053 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 17:20:18, time: 0.811, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0545, loss_cls: 0.2316, acc: 91.9666, loss_bbox: 0.2816, loss_mask: 0.2825, loss: 0.8791 2024-05-31 00:34:56,696 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 17:19:36, time: 0.813, data_time: 0.039, memory: 18874, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0606, loss_cls: 0.2415, acc: 91.7217, loss_bbox: 0.2944, loss_mask: 0.2844, loss: 0.9144 2024-05-31 00:35:36,752 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 17:18:51, time: 0.801, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0568, loss_cls: 0.2299, acc: 92.1372, loss_bbox: 0.2793, loss_mask: 0.2796, loss: 0.8764 2024-05-31 00:36:17,009 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 17:18:07, time: 0.805, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0559, loss_cls: 0.2276, acc: 92.0476, loss_bbox: 0.2764, loss_mask: 0.2802, loss: 0.8718 2024-05-31 00:37:09,525 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 17:18:43, time: 1.050, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0601, loss_cls: 0.2287, acc: 92.0872, loss_bbox: 0.2816, loss_mask: 0.2821, loss: 0.8858 2024-05-31 00:37:52,534 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 17:18:16, time: 0.860, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0592, loss_cls: 0.2512, acc: 91.3706, loss_bbox: 0.3086, loss_mask: 0.2914, loss: 0.9437 2024-05-31 00:38:32,462 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 17:17:30, time: 0.799, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0580, loss_cls: 0.2362, acc: 91.8206, loss_bbox: 0.2816, loss_mask: 0.2786, loss: 0.8845 
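Each iteration line above reports the individual Mask R-CNN loss terms together with their sum: for Epoch [2][4400/7330], 0.0301 + 0.0580 + 0.2362 + 0.2816 + 0.2786 = 0.8845, which is exactly the value printed as loss. Below is a minimal, illustrative sketch (not part of the original run) for turning such lines into loss curves; it assumes the console output has been saved verbatim to a plain-text file, with one logger message per line, and the path train.log is only a placeholder.

# Minimal sketch: parse mmdet-style iteration lines such as
# "Epoch [2][4400/7330] lr: 1.000e-04, ... loss: 0.8845" from a saved log
# and check that the reported total equals the sum of its five components.
import re

LOG_PATH = "train.log"  # placeholder path; save the console output here

ITER_RE = re.compile(
    r"Epoch \[(\d+)\]\[(\d+)/\d+\].*?"
    r"loss_rpn_cls: ([\d.]+), loss_rpn_bbox: ([\d.]+), "
    r"loss_cls: ([\d.]+), acc: [\d.]+, "
    r"loss_bbox: ([\d.]+), loss_mask: ([\d.]+), loss: ([\d.]+)"
)

def parse_iterations(path=LOG_PATH):
    """Yield (epoch, iteration, component_losses, total_loss) per matched line."""
    with open(path) as log_file:
        for match in ITER_RE.finditer(log_file.read()):
            epoch, iteration = int(match.group(1)), int(match.group(2))
            components = tuple(float(v) for v in match.group(3, 4, 5, 6, 7))
            total = float(match.group(8))
            yield epoch, iteration, components, total

if __name__ == "__main__":
    for epoch, iteration, components, total in parse_iterations():
        # Values are printed to four decimals, so allow a small rounding slack.
        assert abs(sum(components) - total) < 2e-3
        print(epoch, iteration, total)

The yielded tuples can be fed directly into any plotting tool to visualize how the total loss and its components decrease over the two epochs logged here.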
2024-05-31 00:39:12,377 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 17:16:43, time: 0.798, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0532, loss_cls: 0.2209, acc: 92.2874, loss_bbox: 0.2739, loss_mask: 0.2801, loss: 0.8569 2024-05-31 00:39:52,716 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 17:15:59, time: 0.807, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0581, loss_cls: 0.2388, acc: 91.5916, loss_bbox: 0.2932, loss_mask: 0.2783, loss: 0.9001 2024-05-31 00:40:32,281 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 17:15:10, time: 0.791, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0548, loss_cls: 0.2325, acc: 92.0718, loss_bbox: 0.2804, loss_mask: 0.2834, loss: 0.8835 2024-05-31 00:41:13,126 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 17:14:29, time: 0.817, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0553, loss_cls: 0.2408, acc: 91.6958, loss_bbox: 0.2939, loss_mask: 0.2842, loss: 0.9060 2024-05-31 00:41:53,043 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 17:13:43, time: 0.798, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0506, loss_cls: 0.2266, acc: 92.2715, loss_bbox: 0.2741, loss_mask: 0.2766, loss: 0.8551 2024-05-31 00:42:38,327 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 17:13:30, time: 0.906, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0542, loss_cls: 0.2227, acc: 92.3501, loss_bbox: 0.2707, loss_mask: 0.2764, loss: 0.8530 2024-05-31 00:43:18,799 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 17:12:47, time: 0.809, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0557, loss_cls: 0.2346, acc: 91.7246, loss_bbox: 0.2889, loss_mask: 0.2726, loss: 0.8828 2024-05-31 00:43:58,825 - mmdet - INFO - Epoch [2][4800/7330] lr: 1.000e-04, eta: 17:12:01, time: 0.801, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0573, loss_cls: 0.2354, acc: 91.9436, loss_bbox: 0.2874, loss_mask: 0.2764, loss: 0.8865 2024-05-31 00:44:39,236 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 17:11:18, time: 0.808, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0581, loss_cls: 0.2408, acc: 91.7241, loss_bbox: 0.2875, loss_mask: 0.2815, loss: 0.8998 2024-05-31 00:45:20,337 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 17:10:39, time: 0.822, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0559, loss_cls: 0.2471, acc: 91.4443, loss_bbox: 0.2930, loss_mask: 0.2818, loss: 0.9079 2024-05-31 00:46:00,588 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 17:09:54, time: 0.805, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0548, loss_cls: 0.2347, acc: 91.8264, loss_bbox: 0.2879, loss_mask: 0.2760, loss: 0.8856 2024-05-31 00:46:40,770 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 17:09:10, time: 0.804, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0581, loss_cls: 0.2468, acc: 91.5847, loss_bbox: 0.3004, loss_mask: 0.2838, loss: 0.9207 2024-05-31 00:47:21,066 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 17:08:25, time: 0.806, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0546, loss_cls: 0.2228, acc: 92.3245, loss_bbox: 0.2723, loss_mask: 0.2756, loss: 0.8532 2024-05-31 00:48:01,315 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 17:07:41, 
time: 0.805, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0552, loss_cls: 0.2419, acc: 91.6914, loss_bbox: 0.2922, loss_mask: 0.2827, loss: 0.9005 2024-05-31 00:48:42,487 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 17:07:02, time: 0.823, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0616, loss_cls: 0.2464, acc: 91.3740, loss_bbox: 0.3061, loss_mask: 0.2821, loss: 0.9288 2024-05-31 00:49:22,874 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 17:06:19, time: 0.808, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0552, loss_cls: 0.2417, acc: 91.6558, loss_bbox: 0.2865, loss_mask: 0.2830, loss: 0.8947 2024-05-31 00:50:03,510 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 17:05:37, time: 0.813, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0558, loss_cls: 0.2362, acc: 91.9495, loss_bbox: 0.2841, loss_mask: 0.2786, loss: 0.8820 2024-05-31 00:50:43,823 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 17:04:53, time: 0.806, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0573, loss_cls: 0.2477, acc: 91.5637, loss_bbox: 0.2932, loss_mask: 0.2841, loss: 0.9146 2024-05-31 00:51:31,251 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 17:04:52, time: 0.949, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0577, loss_cls: 0.2384, acc: 92.0454, loss_bbox: 0.2802, loss_mask: 0.2844, loss: 0.8934 2024-05-31 00:52:18,516 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 17:04:49, time: 0.945, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0548, loss_cls: 0.2321, acc: 92.0825, loss_bbox: 0.2777, loss_mask: 0.2772, loss: 0.8701 2024-05-31 00:52:58,694 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 17:04:04, time: 0.804, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0549, loss_cls: 0.2273, acc: 92.1909, loss_bbox: 0.2752, loss_mask: 0.2741, loss: 0.8626 2024-05-31 00:53:39,143 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 17:03:21, time: 0.809, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0544, loss_cls: 0.2359, acc: 91.9050, loss_bbox: 0.2833, loss_mask: 0.2787, loss: 0.8813 2024-05-31 00:54:20,217 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 17:02:41, time: 0.821, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0572, loss_cls: 0.2368, acc: 91.7905, loss_bbox: 0.2898, loss_mask: 0.2846, loss: 0.8994 2024-05-31 00:55:00,543 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 17:01:57, time: 0.807, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0578, loss_cls: 0.2375, acc: 91.7971, loss_bbox: 0.2896, loss_mask: 0.2831, loss: 0.9003 2024-05-31 00:55:40,990 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 17:01:14, time: 0.809, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0537, loss_cls: 0.2344, acc: 92.0439, loss_bbox: 0.2800, loss_mask: 0.2755, loss: 0.8730 2024-05-31 00:56:21,648 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 17:00:32, time: 0.813, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0574, loss_cls: 0.2364, acc: 91.8711, loss_bbox: 0.2851, loss_mask: 0.2840, loss: 0.8948 2024-05-31 00:57:06,370 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 17:00:13, time: 0.894, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0535, 
loss_cls: 0.2311, acc: 91.9971, loss_bbox: 0.2778, loss_mask: 0.2742, loss: 0.8635 2024-05-31 00:57:46,692 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 16:59:29, time: 0.806, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0525, loss_cls: 0.2387, acc: 91.7532, loss_bbox: 0.2852, loss_mask: 0.2789, loss: 0.8821 2024-05-31 00:58:27,159 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 16:58:46, time: 0.809, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0573, loss_cls: 0.2419, acc: 91.5557, loss_bbox: 0.2930, loss_mask: 0.2737, loss: 0.8937 2024-05-31 00:59:07,190 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 16:58:00, time: 0.801, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0547, loss_cls: 0.2284, acc: 92.1055, loss_bbox: 0.2747, loss_mask: 0.2719, loss: 0.8577 2024-05-31 00:59:47,026 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 16:57:13, time: 0.797, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0539, loss_cls: 0.2363, acc: 92.0625, loss_bbox: 0.2754, loss_mask: 0.2795, loss: 0.8749 2024-05-31 01:00:27,485 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 16:56:30, time: 0.809, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0547, loss_cls: 0.2320, acc: 92.0259, loss_bbox: 0.2768, loss_mask: 0.2796, loss: 0.8723 2024-05-31 01:01:08,537 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 16:55:51, time: 0.821, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0569, loss_cls: 0.2410, acc: 91.6433, loss_bbox: 0.2910, loss_mask: 0.2799, loss: 0.8999 2024-05-31 01:01:49,031 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 16:55:08, time: 0.810, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0570, loss_cls: 0.2337, acc: 91.6384, loss_bbox: 0.2902, loss_mask: 0.2784, loss: 0.8900 2024-05-31 01:02:29,288 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 16:54:23, time: 0.805, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0548, loss_cls: 0.2395, acc: 91.6572, loss_bbox: 0.2935, loss_mask: 0.2799, loss: 0.8988 2024-05-31 01:03:09,898 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 16:53:41, time: 0.812, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0536, loss_cls: 0.2320, acc: 91.9963, loss_bbox: 0.2787, loss_mask: 0.2789, loss: 0.8727 2024-05-31 01:03:50,391 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 16:52:58, time: 0.810, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0539, loss_cls: 0.2402, acc: 91.6726, loss_bbox: 0.2914, loss_mask: 0.2752, loss: 0.8904 2024-05-31 01:04:30,483 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 16:52:13, time: 0.802, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0532, loss_cls: 0.2236, acc: 92.2356, loss_bbox: 0.2724, loss_mask: 0.2748, loss: 0.8512 2024-05-31 01:05:10,952 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 16:51:30, time: 0.809, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0538, loss_cls: 0.2312, acc: 92.0535, loss_bbox: 0.2786, loss_mask: 0.2753, loss: 0.8678 2024-05-31 01:05:51,878 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 16:50:50, time: 0.819, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0583, loss_cls: 0.2327, acc: 91.8933, loss_bbox: 0.2822, loss_mask: 0.2769, loss: 0.8837 2024-05-31 
01:06:47,785 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 16:51:30, time: 1.118, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0572, loss_cls: 0.2328, acc: 91.8628, loss_bbox: 0.2838, loss_mask: 0.2751, loss: 0.8780 2024-05-31 01:07:28,403 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 16:50:48, time: 0.812, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0544, loss_cls: 0.2359, acc: 91.9741, loss_bbox: 0.2855, loss_mask: 0.2738, loss: 0.8795 2024-05-31 01:08:08,837 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 16:50:04, time: 0.809, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0539, loss_cls: 0.2396, acc: 91.8145, loss_bbox: 0.2836, loss_mask: 0.2740, loss: 0.8796 2024-05-31 01:08:48,873 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 16:49:19, time: 0.801, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0525, loss_cls: 0.2225, acc: 92.1628, loss_bbox: 0.2743, loss_mask: 0.2748, loss: 0.8523 2024-05-31 01:09:29,075 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 16:48:34, time: 0.804, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0538, loss_cls: 0.2280, acc: 91.9299, loss_bbox: 0.2799, loss_mask: 0.2751, loss: 0.8666 2024-05-31 01:10:09,489 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 16:47:51, time: 0.808, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0549, loss_cls: 0.2317, acc: 92.0234, loss_bbox: 0.2778, loss_mask: 0.2766, loss: 0.8709 2024-05-31 01:10:49,022 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 16:47:03, time: 0.791, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0503, loss_cls: 0.2110, acc: 92.6868, loss_bbox: 0.2613, loss_mask: 0.2712, loss: 0.8216 2024-05-31 01:11:29,349 - mmdet - INFO - Epoch [2][6800/7330] lr: 1.000e-04, eta: 16:46:19, time: 0.807, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0527, loss_cls: 0.2291, acc: 92.0676, loss_bbox: 0.2747, loss_mask: 0.2742, loss: 0.8603 2024-05-31 01:12:12,931 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 16:45:52, time: 0.872, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0557, loss_cls: 0.2360, acc: 91.7754, loss_bbox: 0.2877, loss_mask: 0.2756, loss: 0.8834 2024-05-31 01:12:53,239 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 16:45:08, time: 0.806, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0555, loss_cls: 0.2398, acc: 91.6248, loss_bbox: 0.2924, loss_mask: 0.2750, loss: 0.8909 2024-05-31 01:13:33,162 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 16:44:22, time: 0.799, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0563, loss_cls: 0.2305, acc: 91.9707, loss_bbox: 0.2814, loss_mask: 0.2747, loss: 0.8717 2024-05-31 01:14:12,955 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 16:43:35, time: 0.795, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0526, loss_cls: 0.2238, acc: 92.1345, loss_bbox: 0.2772, loss_mask: 0.2654, loss: 0.8466 2024-05-31 01:14:53,858 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 16:42:54, time: 0.819, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0575, loss_cls: 0.2342, acc: 91.6917, loss_bbox: 0.2897, loss_mask: 0.2775, loss: 0.8868 2024-05-31 01:15:33,805 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 16:42:09, time: 0.799, 
data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0498, loss_cls: 0.2209, acc: 92.3005, loss_bbox: 0.2721, loss_mask: 0.2633, loss: 0.8342
2024-05-31 01:16:13,932 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 16:41:24, time: 0.802, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0541, loss_cls: 0.2347, acc: 91.7622, loss_bbox: 0.2871, loss_mask: 0.2729, loss: 0.8799
2024-05-31 01:16:53,970 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 16:40:39, time: 0.801, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0521, loss_cls: 0.2220, acc: 92.3416, loss_bbox: 0.2677, loss_mask: 0.2687, loss: 0.8379
2024-05-31 01:17:34,366 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 16:39:55, time: 0.808, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0517, loss_cls: 0.2250, acc: 92.2190, loss_bbox: 0.2709, loss_mask: 0.2754, loss: 0.8520
2024-05-31 01:18:14,255 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 16:39:10, time: 0.798, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0548, loss_cls: 0.2316, acc: 91.9934, loss_bbox: 0.2826, loss_mask: 0.2697, loss: 0.8699
2024-05-31 01:18:38,991 - mmdet - INFO - Saving checkpoint at 2 epochs
2024-05-31 01:20:28,319 - mmdet - INFO - Evaluating bbox...
2024-05-31 01:20:56,808 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.375
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.631
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.394
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.217
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.415
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.523
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.299
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.671
2024-05-31 01:20:56,808 - mmdet - INFO - Evaluating segm...
2024-05-31 01:21:24,648 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.356
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.595
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.374
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.150
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.391
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.572
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.477
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.477
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.477
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.256
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.524
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.679
2024-05-31 01:21:25,153 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 01:21:25,155 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3750, bbox_mAP_50: 0.6310, bbox_mAP_75: 0.3940, bbox_mAP_s: 0.2170, bbox_mAP_m: 0.4150, bbox_mAP_l: 0.5230, bbox_mAP_copypaste: 0.375 0.631 0.394 0.217 0.415 0.523, segm_mAP: 0.3560, segm_mAP_50: 0.5950, segm_mAP_75: 0.3740, segm_mAP_s: 0.1500, segm_mAP_m: 0.3910, segm_mAP_l: 0.5720, segm_mAP_copypaste: 0.356 0.595 0.374 0.150 0.391 0.572
2024-05-31 01:22:15,144 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 16:36:47, time: 0.999, data_time: 0.141, memory: 18874, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0544, loss_cls: 0.2246, acc: 92.1418, loss_bbox: 0.2752, loss_mask: 0.2779, loss: 0.8602
2024-05-31 01:23:00,848 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 16:36:31, time: 0.914, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0544, loss_cls: 0.2320, acc: 91.8730, loss_bbox: 0.2835, loss_mask: 0.2779, loss: 0.8771
2024-05-31 01:23:44,598 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 16:36:04, time: 0.875, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0540, loss_cls: 0.2236, acc: 92.1062, loss_bbox: 0.2756, loss_mask: 0.2710, loss: 0.8516
2024-05-31 01:24:25,246 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 16:35:22, time: 0.813, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0518, loss_cls: 0.2238, acc: 92.1416, loss_bbox: 0.2760, loss_mask: 0.2682, loss: 0.8456
2024-05-31 01:25:05,906 - mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 16:34:41, time: 0.813, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0537, loss_cls: 0.2226, acc: 92.0750, loss_bbox: 0.2774, loss_mask: 0.2707, loss: 0.8509
2024-05-31 01:25:47,023 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 16:34:01, time: 0.822, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0586, loss_cls: 0.2327, acc: 91.6655, loss_bbox: 0.2904, loss_mask: 0.2724, loss: 0.8826
2024-05-31 01:26:27,199 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 16:33:17, time: 0.803, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0525, loss_cls: 0.2232, acc: 92.0312, loss_bbox: 0.2778, loss_mask: 0.2738, loss: 0.8524
2024-05-31 01:27:07,740 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 16:32:35, time: 0.811, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0533, loss_cls: 0.2284, acc: 92.0471, loss_bbox: 0.2808,
loss_mask: 0.2698, loss: 0.8611 2024-05-31 01:27:48,928 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 16:31:55, time: 0.824, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0531, loss_cls: 0.2232, acc: 92.0684, loss_bbox: 0.2786, loss_mask: 0.2709, loss: 0.8511 2024-05-31 01:28:34,165 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 16:31:36, time: 0.905, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0508, loss_cls: 0.2171, acc: 92.4360, loss_bbox: 0.2690, loss_mask: 0.2667, loss: 0.8298 2024-05-31 01:29:14,515 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 16:30:52, time: 0.807, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0554, loss_cls: 0.2216, acc: 92.0701, loss_bbox: 0.2784, loss_mask: 0.2671, loss: 0.8510 2024-05-31 01:29:56,008 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 16:30:14, time: 0.829, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0547, loss_cls: 0.2266, acc: 92.0334, loss_bbox: 0.2804, loss_mask: 0.2672, loss: 0.8551 2024-05-31 01:30:37,361 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 16:29:36, time: 0.827, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0563, loss_cls: 0.2267, acc: 92.0073, loss_bbox: 0.2756, loss_mask: 0.2666, loss: 0.8530 2024-05-31 01:31:18,125 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 16:28:55, time: 0.815, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0514, loss_cls: 0.2195, acc: 92.2163, loss_bbox: 0.2750, loss_mask: 0.2683, loss: 0.8409 2024-05-31 01:31:59,249 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 16:28:15, time: 0.822, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0551, loss_cls: 0.2192, acc: 92.2146, loss_bbox: 0.2729, loss_mask: 0.2629, loss: 0.8389 2024-05-31 01:32:39,746 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 16:27:32, time: 0.810, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0507, loss_cls: 0.2108, acc: 92.4995, loss_bbox: 0.2639, loss_mask: 0.2644, loss: 0.8165 2024-05-31 01:33:19,843 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 16:26:48, time: 0.802, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0490, loss_cls: 0.2161, acc: 92.3955, loss_bbox: 0.2716, loss_mask: 0.2648, loss: 0.8274 2024-05-31 01:34:00,590 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 16:26:06, time: 0.815, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0537, loss_cls: 0.2191, acc: 92.2124, loss_bbox: 0.2757, loss_mask: 0.2625, loss: 0.8372 2024-05-31 01:34:41,104 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 16:25:24, time: 0.810, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0529, loss_cls: 0.2190, acc: 92.3462, loss_bbox: 0.2689, loss_mask: 0.2699, loss: 0.8365 2024-05-31 01:35:21,616 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 16:24:42, time: 0.810, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0536, loss_cls: 0.2192, acc: 92.2732, loss_bbox: 0.2681, loss_mask: 0.2654, loss: 0.8317 2024-05-31 01:36:02,364 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 16:24:00, time: 0.815, data_time: 0.076, memory: 18874, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0513, loss_cls: 0.2266, acc: 91.9246, loss_bbox: 0.2770, loss_mask: 0.2646, loss: 0.8462 2024-05-31 01:36:48,845 - mmdet - INFO - Epoch [3][1100/7330] lr: 
1.000e-04, eta: 16:23:45, time: 0.930, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0539, loss_cls: 0.2157, acc: 92.4775, loss_bbox: 0.2636, loss_mask: 0.2635, loss: 0.8228 2024-05-31 01:37:33,861 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 16:23:23, time: 0.900, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0509, loss_cls: 0.2225, acc: 92.0857, loss_bbox: 0.2771, loss_mask: 0.2713, loss: 0.8480 2024-05-31 01:38:19,096 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 16:23:02, time: 0.905, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0488, loss_cls: 0.2119, acc: 92.5947, loss_bbox: 0.2609, loss_mask: 0.2625, loss: 0.8091 2024-05-31 01:39:00,053 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 16:22:21, time: 0.819, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0594, loss_cls: 0.2333, acc: 91.7903, loss_bbox: 0.2920, loss_mask: 0.2786, loss: 0.8938 2024-05-31 01:39:40,865 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 16:21:40, time: 0.816, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0530, loss_cls: 0.2197, acc: 92.0818, loss_bbox: 0.2808, loss_mask: 0.2677, loss: 0.8480 2024-05-31 01:40:21,670 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 16:20:59, time: 0.816, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0531, loss_cls: 0.2183, acc: 92.2288, loss_bbox: 0.2711, loss_mask: 0.2657, loss: 0.8325 2024-05-31 01:41:02,755 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 16:20:19, time: 0.822, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0499, loss_cls: 0.2224, acc: 92.0525, loss_bbox: 0.2764, loss_mask: 0.2686, loss: 0.8423 2024-05-31 01:41:43,749 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 16:19:38, time: 0.820, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0522, loss_cls: 0.2237, acc: 91.9990, loss_bbox: 0.2818, loss_mask: 0.2686, loss: 0.8540 2024-05-31 01:42:24,673 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 16:18:57, time: 0.819, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0541, loss_cls: 0.2188, acc: 92.1716, loss_bbox: 0.2741, loss_mask: 0.2602, loss: 0.8370 2024-05-31 01:43:10,457 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 16:18:38, time: 0.916, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0552, loss_cls: 0.2202, acc: 92.1316, loss_bbox: 0.2758, loss_mask: 0.2632, loss: 0.8383 2024-05-31 01:43:51,507 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 16:17:58, time: 0.821, data_time: 0.084, memory: 18874, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0554, loss_cls: 0.2253, acc: 91.9209, loss_bbox: 0.2772, loss_mask: 0.2656, loss: 0.8507 2024-05-31 01:44:32,268 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 16:17:16, time: 0.815, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0503, loss_cls: 0.2239, acc: 92.1802, loss_bbox: 0.2711, loss_mask: 0.2698, loss: 0.8419 2024-05-31 01:45:14,018 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 16:16:39, time: 0.835, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0566, loss_cls: 0.2241, acc: 92.0679, loss_bbox: 0.2738, loss_mask: 0.2671, loss: 0.8496 2024-05-31 01:45:54,630 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 16:15:56, time: 0.812, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0277, 
loss_rpn_bbox: 0.0534, loss_cls: 0.2296, acc: 91.9661, loss_bbox: 0.2819, loss_mask: 0.2730, loss: 0.8656 2024-05-31 01:46:34,926 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 16:15:13, time: 0.806, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0484, loss_cls: 0.2106, acc: 92.6333, loss_bbox: 0.2566, loss_mask: 0.2567, loss: 0.7959 2024-05-31 01:47:15,834 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 16:14:32, time: 0.818, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0537, loss_cls: 0.2243, acc: 92.1501, loss_bbox: 0.2686, loss_mask: 0.2669, loss: 0.8400 2024-05-31 01:47:56,088 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 16:13:48, time: 0.805, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0490, loss_cls: 0.2034, acc: 92.8567, loss_bbox: 0.2527, loss_mask: 0.2552, loss: 0.7846 2024-05-31 01:48:37,109 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 16:13:08, time: 0.820, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0547, loss_cls: 0.2226, acc: 92.0862, loss_bbox: 0.2770, loss_mask: 0.2652, loss: 0.8476 2024-05-31 01:49:17,650 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 16:12:25, time: 0.811, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0509, loss_cls: 0.2132, acc: 92.5625, loss_bbox: 0.2633, loss_mask: 0.2623, loss: 0.8143 2024-05-31 01:49:58,448 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 16:11:44, time: 0.816, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0505, loss_cls: 0.2162, acc: 92.2600, loss_bbox: 0.2713, loss_mask: 0.2641, loss: 0.8277 2024-05-31 01:50:38,939 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 16:11:01, time: 0.810, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0515, loss_cls: 0.2206, acc: 92.1931, loss_bbox: 0.2748, loss_mask: 0.2626, loss: 0.8342 2024-05-31 01:51:25,833 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 16:10:45, time: 0.938, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0519, loss_cls: 0.2241, acc: 91.9751, loss_bbox: 0.2786, loss_mask: 0.2629, loss: 0.8443 2024-05-31 01:52:09,489 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 16:10:16, time: 0.873, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0544, loss_cls: 0.2261, acc: 92.0623, loss_bbox: 0.2786, loss_mask: 0.2707, loss: 0.8587 2024-05-31 01:52:54,560 - mmdet - INFO - Epoch [3][2250/7330] lr: 1.000e-04, eta: 16:09:52, time: 0.901, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0528, loss_cls: 0.2225, acc: 92.2000, loss_bbox: 0.2749, loss_mask: 0.2631, loss: 0.8397 2024-05-31 01:53:37,766 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 16:09:21, time: 0.864, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0522, loss_cls: 0.2252, acc: 92.2068, loss_bbox: 0.2732, loss_mask: 0.2656, loss: 0.8424 2024-05-31 01:54:17,820 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 16:08:36, time: 0.801, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0507, loss_cls: 0.2098, acc: 92.6135, loss_bbox: 0.2599, loss_mask: 0.2606, loss: 0.8048 2024-05-31 01:54:58,730 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 16:07:55, time: 0.818, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0528, loss_cls: 0.2201, acc: 92.1838, loss_bbox: 0.2753, loss_mask: 0.2641, 
loss: 0.8378 2024-05-31 01:55:38,841 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 16:07:10, time: 0.802, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0515, loss_cls: 0.2107, acc: 92.3704, loss_bbox: 0.2632, loss_mask: 0.2609, loss: 0.8144 2024-05-31 01:56:19,602 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 16:06:29, time: 0.815, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0511, loss_cls: 0.2218, acc: 92.1870, loss_bbox: 0.2758, loss_mask: 0.2666, loss: 0.8395 2024-05-31 01:57:00,913 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 16:05:49, time: 0.826, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0561, loss_cls: 0.2192, acc: 92.1885, loss_bbox: 0.2677, loss_mask: 0.2599, loss: 0.8298 2024-05-31 01:57:43,517 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 16:05:15, time: 0.852, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0510, loss_cls: 0.2185, acc: 92.2319, loss_bbox: 0.2773, loss_mask: 0.2652, loss: 0.8373 2024-05-31 01:58:26,413 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 16:04:42, time: 0.858, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0498, loss_cls: 0.2147, acc: 92.3679, loss_bbox: 0.2680, loss_mask: 0.2631, loss: 0.8215 2024-05-31 01:59:07,419 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 16:04:01, time: 0.820, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0521, loss_cls: 0.2261, acc: 92.0076, loss_bbox: 0.2745, loss_mask: 0.2634, loss: 0.8420 2024-05-31 01:59:48,596 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 16:03:21, time: 0.824, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0520, loss_cls: 0.2232, acc: 92.1140, loss_bbox: 0.2751, loss_mask: 0.2661, loss: 0.8423 2024-05-31 02:00:29,268 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 16:02:39, time: 0.813, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0502, loss_cls: 0.2191, acc: 92.2964, loss_bbox: 0.2655, loss_mask: 0.2588, loss: 0.8187 2024-05-31 02:01:09,893 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 16:01:57, time: 0.813, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0523, loss_cls: 0.2193, acc: 92.2375, loss_bbox: 0.2682, loss_mask: 0.2589, loss: 0.8247 2024-05-31 02:01:50,292 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 16:01:13, time: 0.808, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0511, loss_cls: 0.2256, acc: 92.2363, loss_bbox: 0.2704, loss_mask: 0.2612, loss: 0.8358 2024-05-31 02:02:30,947 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 16:00:31, time: 0.813, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0547, loss_cls: 0.2216, acc: 92.1838, loss_bbox: 0.2721, loss_mask: 0.2571, loss: 0.8329 2024-05-31 02:03:11,117 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 15:59:47, time: 0.803, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0501, loss_cls: 0.2135, acc: 92.4648, loss_bbox: 0.2663, loss_mask: 0.2612, loss: 0.8150 2024-05-31 02:03:51,296 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 15:59:03, time: 0.804, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0497, loss_cls: 0.2123, acc: 92.5840, loss_bbox: 0.2530, loss_mask: 0.2584, loss: 0.7981 2024-05-31 02:04:31,975 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 
15:58:21, time: 0.814, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0542, loss_cls: 0.2241, acc: 92.0935, loss_bbox: 0.2770, loss_mask: 0.2629, loss: 0.8458 2024-05-31 02:05:12,474 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 15:57:38, time: 0.810, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0513, loss_cls: 0.2106, acc: 92.4993, loss_bbox: 0.2586, loss_mask: 0.2666, loss: 0.8110 2024-05-31 02:05:53,077 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 15:56:56, time: 0.812, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0518, loss_cls: 0.2143, acc: 92.5051, loss_bbox: 0.2661, loss_mask: 0.2613, loss: 0.8194 2024-05-31 02:06:38,559 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 15:56:33, time: 0.910, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0545, loss_cls: 0.2267, acc: 92.0698, loss_bbox: 0.2787, loss_mask: 0.2685, loss: 0.8547 2024-05-31 02:07:26,148 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 15:56:17, time: 0.952, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0511, loss_cls: 0.2142, acc: 92.5349, loss_bbox: 0.2610, loss_mask: 0.2596, loss: 0.8113 2024-05-31 02:08:10,212 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 15:55:48, time: 0.881, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0501, loss_cls: 0.2106, acc: 92.5278, loss_bbox: 0.2598, loss_mask: 0.2619, loss: 0.8065 2024-05-31 02:08:51,811 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 15:55:10, time: 0.832, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0536, loss_cls: 0.2255, acc: 91.9866, loss_bbox: 0.2817, loss_mask: 0.2669, loss: 0.8560 2024-05-31 02:09:32,796 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 15:54:29, time: 0.820, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0528, loss_cls: 0.2288, acc: 91.9377, loss_bbox: 0.2741, loss_mask: 0.2675, loss: 0.8491 2024-05-31 02:10:12,900 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 15:53:44, time: 0.802, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0522, loss_cls: 0.2153, acc: 92.4421, loss_bbox: 0.2645, loss_mask: 0.2625, loss: 0.8200 2024-05-31 02:10:53,451 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 15:53:02, time: 0.811, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0489, loss_cls: 0.2185, acc: 92.3025, loss_bbox: 0.2696, loss_mask: 0.2565, loss: 0.8179 2024-05-31 02:11:33,980 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 15:52:19, time: 0.811, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0483, loss_cls: 0.2159, acc: 92.3909, loss_bbox: 0.2634, loss_mask: 0.2613, loss: 0.8145 2024-05-31 02:12:16,683 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 15:51:44, time: 0.854, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0525, loss_cls: 0.2212, acc: 92.1135, loss_bbox: 0.2784, loss_mask: 0.2661, loss: 0.8443 2024-05-31 02:12:59,850 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 15:51:12, time: 0.863, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0500, loss_cls: 0.2166, acc: 92.3582, loss_bbox: 0.2583, loss_mask: 0.2625, loss: 0.8129 2024-05-31 02:13:40,802 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 15:50:30, time: 0.819, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 
0.0506, loss_cls: 0.2189, acc: 92.3953, loss_bbox: 0.2658, loss_mask: 0.2690, loss: 0.8300 2024-05-31 02:14:21,529 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 15:49:48, time: 0.815, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0533, loss_cls: 0.2220, acc: 92.1411, loss_bbox: 0.2753, loss_mask: 0.2663, loss: 0.8421 2024-05-31 02:15:01,781 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 15:49:05, time: 0.805, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0513, loss_cls: 0.2169, acc: 92.2451, loss_bbox: 0.2680, loss_mask: 0.2607, loss: 0.8238 2024-05-31 02:15:42,675 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 15:48:23, time: 0.818, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0518, loss_cls: 0.2069, acc: 92.6555, loss_bbox: 0.2592, loss_mask: 0.2576, loss: 0.8008 2024-05-31 02:16:22,598 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 15:47:38, time: 0.798, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0510, loss_cls: 0.2167, acc: 92.3408, loss_bbox: 0.2688, loss_mask: 0.2628, loss: 0.8258 2024-05-31 02:17:03,022 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 15:46:55, time: 0.808, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0515, loss_cls: 0.2202, acc: 92.2209, loss_bbox: 0.2643, loss_mask: 0.2641, loss: 0.8265 2024-05-31 02:17:43,602 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 15:46:13, time: 0.812, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0532, loss_cls: 0.2170, acc: 92.4441, loss_bbox: 0.2560, loss_mask: 0.2551, loss: 0.8069 2024-05-31 02:18:24,500 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 15:45:31, time: 0.818, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0536, loss_cls: 0.2275, acc: 91.9368, loss_bbox: 0.2794, loss_mask: 0.2701, loss: 0.8571 2024-05-31 02:19:04,843 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 15:44:48, time: 0.807, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0488, loss_cls: 0.2100, acc: 92.5225, loss_bbox: 0.2634, loss_mask: 0.2563, loss: 0.8026 2024-05-31 02:19:45,857 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 15:44:07, time: 0.820, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0539, loss_cls: 0.2231, acc: 92.2039, loss_bbox: 0.2730, loss_mask: 0.2678, loss: 0.8449 2024-05-31 02:20:26,178 - mmdet - INFO - Epoch [3][4250/7330] lr: 1.000e-04, eta: 15:43:24, time: 0.806, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0503, loss_cls: 0.2095, acc: 92.6997, loss_bbox: 0.2539, loss_mask: 0.2663, loss: 0.8032 2024-05-31 02:21:12,975 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 15:43:04, time: 0.936, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0518, loss_cls: 0.2187, acc: 92.2068, loss_bbox: 0.2691, loss_mask: 0.2594, loss: 0.8250 2024-05-31 02:21:59,280 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 15:42:42, time: 0.926, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0490, loss_cls: 0.2147, acc: 92.2622, loss_bbox: 0.2680, loss_mask: 0.2655, loss: 0.8208 2024-05-31 02:22:44,029 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 15:42:14, time: 0.895, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0472, loss_cls: 0.2109, acc: 92.6655, loss_bbox: 0.2536, loss_mask: 0.2516, loss: 0.7874 
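As the entries themselves show, the reported per-iteration loss is just the sum of the five component losses; for the Epoch [3][4400/7330] record just above, 0.0240 + 0.0472 + 0.2109 + 0.2536 + 0.2516 = 0.7873, matching the logged 0.7874 up to 4-decimal rounding. A minimal parsing sketch for pulling the training-loss curve and the Epoch(val) mAP summaries out of such a log. This is not an mmdet tool; it assumes the log has been saved as a plain-text file with one record per line (the name train.log below is hypothetical):

import re
from collections import defaultdict

# Hypothetical path: point this at the saved plain-text training log.
LOG_PATH = "train.log"

# A training record (one per line in the raw log) looks like:
#   ... Epoch [3][4400/7330] lr: 1.000e-04, ..., loss_rpn_cls: 0.0240, ..., loss: 0.7874
ITER_RE = re.compile(
    r"Epoch \[(\d+)\]\[(\d+)/\d+\].*?"
    r"loss_rpn_cls: ([\d.]+), loss_rpn_bbox: ([\d.]+), loss_cls: ([\d.]+), "
    r"acc: [\d.]+, loss_bbox: ([\d.]+), loss_mask: ([\d.]+), loss: ([\d.]+)"
)
# A validation summary looks like:
#   ... Epoch(val) [2][625] bbox_mAP: 0.3750, ..., segm_mAP: 0.3560, ...
VAL_RE = re.compile(r"Epoch\(val\) \[(\d+)\].*?bbox_mAP: ([\d.]+),.*?segm_mAP: ([\d.]+),")

train_losses = defaultdict(list)  # epoch -> list of logged total losses
val_map = {}                      # epoch -> (bbox_mAP, segm_mAP)

with open(LOG_PATH) as f:
    for line in f:
        m = ITER_RE.search(line)
        if m:
            epoch = int(m.group(1))
            parts = [float(x) for x in m.groups()[2:]]  # rpn_cls, rpn_bbox, cls, bbox, mask, total
            # For this log's Mask R-CNN losses, the total is the sum of the five components
            # (up to 4-decimal print rounding).
            assert abs(sum(parts[:5]) - parts[5]) < 1e-3
            train_losses[epoch].append(parts[5])
            continue
        m = VAL_RE.search(line)
        if m:
            val_map[int(m.group(1))] = (float(m.group(2)), float(m.group(3)))

for epoch in sorted(train_losses):
    mean_loss = sum(train_losses[epoch]) / len(train_losses[epoch])
    bbox_map, segm_map = val_map.get(epoch, (None, None))
    print(f"epoch {epoch}: mean train loss {mean_loss:.4f}, bbox_mAP {bbox_map}, segm_mAP {segm_map}")

After adjusting LOG_PATH, running the script prints one summary line per epoch (mean training loss plus the bbox/segm mAP from that epoch's validation, if present), which is enough to plot the curves corresponding to the records in this log.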
2024-05-31 02:23:23,874 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 15:41:29, time: 0.797, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0510, loss_cls: 0.2145, acc: 92.3745, loss_bbox: 0.2695, loss_mask: 0.2666, loss: 0.8263 2024-05-31 02:24:04,685 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 15:40:47, time: 0.816, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0514, loss_cls: 0.2316, acc: 91.7595, loss_bbox: 0.2810, loss_mask: 0.2675, loss: 0.8577 2024-05-31 02:24:44,653 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 15:40:03, time: 0.799, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0485, loss_cls: 0.2093, acc: 92.7083, loss_bbox: 0.2535, loss_mask: 0.2619, loss: 0.7959 2024-05-31 02:25:26,194 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 15:39:23, time: 0.831, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0522, loss_cls: 0.2288, acc: 92.0801, loss_bbox: 0.2725, loss_mask: 0.2639, loss: 0.8431 2024-05-31 02:26:07,597 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 15:38:44, time: 0.828, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0543, loss_cls: 0.2184, acc: 92.3237, loss_bbox: 0.2712, loss_mask: 0.2644, loss: 0.8359 2024-05-31 02:26:52,011 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 15:38:15, time: 0.888, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0551, loss_cls: 0.2318, acc: 91.8103, loss_bbox: 0.2896, loss_mask: 0.2706, loss: 0.8756 2024-05-31 02:27:35,976 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 15:37:44, time: 0.879, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0539, loss_cls: 0.2193, acc: 92.0764, loss_bbox: 0.2738, loss_mask: 0.2666, loss: 0.8405 2024-05-31 02:28:16,729 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 15:37:02, time: 0.815, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0521, loss_cls: 0.2188, acc: 92.3206, loss_bbox: 0.2711, loss_mask: 0.2613, loss: 0.8274 2024-05-31 02:28:57,769 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 15:36:21, time: 0.821, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0509, loss_cls: 0.2230, acc: 92.0913, loss_bbox: 0.2716, loss_mask: 0.2657, loss: 0.8367 2024-05-31 02:29:38,749 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 15:35:40, time: 0.820, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0473, loss_cls: 0.2068, acc: 92.6262, loss_bbox: 0.2611, loss_mask: 0.2591, loss: 0.7976 2024-05-31 02:30:19,029 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 15:34:56, time: 0.806, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0511, loss_cls: 0.2146, acc: 92.3469, loss_bbox: 0.2703, loss_mask: 0.2571, loss: 0.8175 2024-05-31 02:30:59,114 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 15:34:12, time: 0.802, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0513, loss_cls: 0.2212, acc: 92.2305, loss_bbox: 0.2721, loss_mask: 0.2660, loss: 0.8361 2024-05-31 02:31:39,569 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 15:33:29, time: 0.809, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0512, loss_cls: 0.2177, acc: 92.4194, loss_bbox: 0.2651, loss_mask: 0.2580, loss: 0.8185 2024-05-31 02:32:20,098 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 15:32:46, 
time: 0.811, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0527, loss_cls: 0.2206, acc: 92.2725, loss_bbox: 0.2722, loss_mask: 0.2657, loss: 0.8372 2024-05-31 02:33:01,138 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 15:32:05, time: 0.821, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0539, loss_cls: 0.2248, acc: 92.1184, loss_bbox: 0.2747, loss_mask: 0.2642, loss: 0.8445 2024-05-31 02:33:42,476 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 15:31:25, time: 0.827, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0520, loss_cls: 0.2105, acc: 92.4983, loss_bbox: 0.2653, loss_mask: 0.2645, loss: 0.8188 2024-05-31 02:34:22,957 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 15:30:42, time: 0.810, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0519, loss_cls: 0.2137, acc: 92.4285, loss_bbox: 0.2708, loss_mask: 0.2670, loss: 0.8303 2024-05-31 02:35:04,112 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 15:30:01, time: 0.823, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0532, loss_cls: 0.2180, acc: 92.2583, loss_bbox: 0.2677, loss_mask: 0.2650, loss: 0.8283 2024-05-31 02:35:49,565 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 15:29:35, time: 0.909, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0502, loss_cls: 0.2116, acc: 92.5752, loss_bbox: 0.2606, loss_mask: 0.2587, loss: 0.8066 2024-05-31 02:36:34,466 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 15:29:07, time: 0.898, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0513, loss_cls: 0.2160, acc: 92.5012, loss_bbox: 0.2607, loss_mask: 0.2588, loss: 0.8124 2024-05-31 02:37:20,168 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 15:28:42, time: 0.914, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0515, loss_cls: 0.2093, acc: 92.5239, loss_bbox: 0.2585, loss_mask: 0.2564, loss: 0.8001 2024-05-31 02:38:00,907 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 15:28:00, time: 0.815, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0489, loss_cls: 0.2223, acc: 92.1819, loss_bbox: 0.2751, loss_mask: 0.2593, loss: 0.8285 2024-05-31 02:38:40,855 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 15:27:15, time: 0.799, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0486, loss_cls: 0.2137, acc: 92.4800, loss_bbox: 0.2588, loss_mask: 0.2623, loss: 0.8088 2024-05-31 02:39:21,635 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 15:26:33, time: 0.816, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0541, loss_cls: 0.2226, acc: 92.1475, loss_bbox: 0.2738, loss_mask: 0.2671, loss: 0.8457 2024-05-31 02:40:02,293 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 15:25:51, time: 0.813, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0503, loss_cls: 0.2213, acc: 92.1648, loss_bbox: 0.2753, loss_mask: 0.2675, loss: 0.8407 2024-05-31 02:40:42,956 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 15:25:08, time: 0.813, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0534, loss_cls: 0.2219, acc: 92.1274, loss_bbox: 0.2758, loss_mask: 0.2651, loss: 0.8431 2024-05-31 02:41:23,687 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 15:24:26, time: 0.815, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0496, 
loss_cls: 0.2188, acc: 92.3091, loss_bbox: 0.2681, loss_mask: 0.2644, loss: 0.8257 2024-05-31 02:42:09,712 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 15:24:02, time: 0.921, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0538, loss_cls: 0.2137, acc: 92.3142, loss_bbox: 0.2700, loss_mask: 0.2621, loss: 0.8255 2024-05-31 02:42:49,929 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 15:23:18, time: 0.804, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0494, loss_cls: 0.2126, acc: 92.5823, loss_bbox: 0.2607, loss_mask: 0.2594, loss: 0.8066 2024-05-31 02:43:30,592 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 15:22:35, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0506, loss_cls: 0.2175, acc: 92.3950, loss_bbox: 0.2637, loss_mask: 0.2629, loss: 0.8182 2024-05-31 02:44:11,258 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 15:21:53, time: 0.813, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0519, loss_cls: 0.2099, acc: 92.5791, loss_bbox: 0.2606, loss_mask: 0.2542, loss: 0.8027 2024-05-31 02:44:52,326 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 15:21:12, time: 0.821, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0508, loss_cls: 0.2074, acc: 92.5662, loss_bbox: 0.2557, loss_mask: 0.2531, loss: 0.7913 2024-05-31 02:45:32,424 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 15:20:28, time: 0.802, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0501, loss_cls: 0.2167, acc: 92.3188, loss_bbox: 0.2662, loss_mask: 0.2648, loss: 0.8216 2024-05-31 02:46:13,122 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 15:19:45, time: 0.814, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0556, loss_cls: 0.2151, acc: 92.3218, loss_bbox: 0.2704, loss_mask: 0.2644, loss: 0.8319 2024-05-31 02:46:53,159 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 15:19:01, time: 0.801, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0530, loss_cls: 0.2139, acc: 92.4377, loss_bbox: 0.2674, loss_mask: 0.2592, loss: 0.8204 2024-05-31 02:47:33,439 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 15:18:17, time: 0.806, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0537, loss_cls: 0.2164, acc: 92.3689, loss_bbox: 0.2657, loss_mask: 0.2616, loss: 0.8239 2024-05-31 02:48:14,305 - mmdet - INFO - Epoch [3][6250/7330] lr: 1.000e-04, eta: 15:17:36, time: 0.817, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0505, loss_cls: 0.2105, acc: 92.5251, loss_bbox: 0.2601, loss_mask: 0.2597, loss: 0.8077 2024-05-31 02:48:54,158 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 15:16:51, time: 0.797, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0487, loss_cls: 0.2063, acc: 92.5273, loss_bbox: 0.2631, loss_mask: 0.2596, loss: 0.8028 2024-05-31 02:49:34,999 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 15:16:09, time: 0.817, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0523, loss_cls: 0.2087, acc: 92.6042, loss_bbox: 0.2643, loss_mask: 0.2592, loss: 0.8096 2024-05-31 02:50:15,232 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 15:15:25, time: 0.805, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0506, loss_cls: 0.2181, acc: 92.2913, loss_bbox: 0.2647, loss_mask: 0.2603, loss: 0.8186 2024-05-31 
02:51:01,217 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 15:15:00, time: 0.920, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0549, loss_cls: 0.2112, acc: 92.5774, loss_bbox: 0.2583, loss_mask: 0.2629, loss: 0.8141 2024-05-31 02:51:48,329 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 15:14:38, time: 0.942, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0495, loss_cls: 0.2113, acc: 92.5383, loss_bbox: 0.2582, loss_mask: 0.2581, loss: 0.8021 2024-05-31 02:52:31,354 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 15:14:03, time: 0.861, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0492, loss_cls: 0.2054, acc: 92.6523, loss_bbox: 0.2601, loss_mask: 0.2582, loss: 0.7990 2024-05-31 02:53:11,833 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 15:13:20, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0532, loss_cls: 0.2191, acc: 92.3145, loss_bbox: 0.2663, loss_mask: 0.2652, loss: 0.8281 2024-05-31 02:53:52,299 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 15:12:37, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0520, loss_cls: 0.2242, acc: 92.0784, loss_bbox: 0.2671, loss_mask: 0.2596, loss: 0.8283 2024-05-31 02:54:33,014 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 15:11:55, time: 0.814, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0517, loss_cls: 0.2231, acc: 92.0786, loss_bbox: 0.2766, loss_mask: 0.2581, loss: 0.8348 2024-05-31 02:55:12,860 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 15:11:10, time: 0.797, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0492, loss_cls: 0.2054, acc: 92.7063, loss_bbox: 0.2543, loss_mask: 0.2538, loss: 0.7858 2024-05-31 02:55:53,033 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 15:10:26, time: 0.803, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0533, loss_cls: 0.2156, acc: 92.3965, loss_bbox: 0.2654, loss_mask: 0.2661, loss: 0.8269 2024-05-31 02:56:36,352 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 15:09:52, time: 0.866, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0536, loss_cls: 0.2085, acc: 92.6777, loss_bbox: 0.2623, loss_mask: 0.2606, loss: 0.8142 2024-05-31 02:57:19,746 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 15:09:18, time: 0.868, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0520, loss_cls: 0.2124, acc: 92.4231, loss_bbox: 0.2684, loss_mask: 0.2664, loss: 0.8267 2024-05-31 02:58:00,950 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 15:08:37, time: 0.824, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0506, loss_cls: 0.2223, acc: 92.0979, loss_bbox: 0.2760, loss_mask: 0.2635, loss: 0.8384 2024-05-31 02:58:41,301 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 15:07:54, time: 0.807, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0499, loss_cls: 0.2086, acc: 92.6230, loss_bbox: 0.2537, loss_mask: 0.2533, loss: 0.7918 2024-05-31 02:59:21,676 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 15:07:11, time: 0.808, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0502, loss_cls: 0.2099, acc: 92.5918, loss_bbox: 0.2604, loss_mask: 0.2546, loss: 0.8005 2024-05-31 03:00:02,381 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 15:06:29, time: 0.814, 
data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0552, loss_cls: 0.2278, acc: 91.9065, loss_bbox: 0.2775, loss_mask: 0.2662, loss: 0.8528
2024-05-31 03:00:42,981 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 15:05:46, time: 0.812, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0483, loss_cls: 0.2195, acc: 92.2251, loss_bbox: 0.2662, loss_mask: 0.2573, loss: 0.8168
2024-05-31 03:01:23,408 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 15:05:03, time: 0.809, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0498, loss_cls: 0.2145, acc: 92.4265, loss_bbox: 0.2671, loss_mask: 0.2628, loss: 0.8201
2024-05-31 03:02:03,585 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 15:04:19, time: 0.804, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0519, loss_cls: 0.2216, acc: 92.1213, loss_bbox: 0.2704, loss_mask: 0.2651, loss: 0.8353
2024-05-31 03:02:44,083 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 15:03:36, time: 0.810, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0516, loss_cls: 0.2125, acc: 92.4351, loss_bbox: 0.2673, loss_mask: 0.2571, loss: 0.8123
2024-05-31 03:03:08,929 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-05-31 03:04:58,404 - mmdet - INFO - Evaluating bbox...
2024-05-31 03:05:25,455 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.416
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.665
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.453
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.253
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.464
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.570
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.543
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.543
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.543
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.353
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.588
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.705
2024-05-31 03:05:25,455 - mmdet - INFO - Evaluating segm...
2024-05-31 03:05:52,596 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.385
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.629
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.408
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.173
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.421
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.601
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.291
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.552
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.698
2024-05-31 03:05:52,950 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 03:05:52,951 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.4160, bbox_mAP_50: 0.6650, bbox_mAP_75: 0.4530, bbox_mAP_s: 0.2530, bbox_mAP_m: 0.4640, bbox_mAP_l: 0.5700, bbox_mAP_copypaste: 0.416 0.665 0.453 0.253 0.464 0.570, segm_mAP: 0.3850, segm_mAP_50: 0.6290, segm_mAP_75: 0.4080, segm_mAP_s: 0.1730, segm_mAP_m: 0.4210, segm_mAP_l: 0.6010, segm_mAP_copypaste: 0.385 0.629 0.408 0.173 0.421 0.601
2024-05-31 03:06:43,761 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 15:01:46, time: 1.016, data_time: 0.133, memory: 18874, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0524, loss_cls: 0.2069, acc: 92.4792, loss_bbox: 0.2594, loss_mask: 0.2591, loss: 0.8012
2024-05-31 03:07:26,622 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 15:01:10, time: 0.857, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0482, loss_cls: 0.2096, acc: 92.4780, loss_bbox: 0.2588, loss_mask: 0.2591, loss: 0.7998
2024-05-31 03:08:11,943 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 15:00:42, time: 0.906, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0494, loss_cls: 0.2052, acc: 92.6714, loss_bbox: 0.2634, loss_mask: 0.2536, loss: 0.7942
2024-05-31 03:08:57,166 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 15:00:13, time: 0.904, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0495, loss_cls: 0.2076, acc: 92.5518, loss_bbox: 0.2601, loss_mask: 0.2552, loss: 0.7944
2024-05-31 03:09:37,797 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 14:59:31, time: 0.813, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0491, loss_cls: 0.1949, acc: 92.9553, loss_bbox: 0.2504, loss_mask: 0.2464, loss: 0.7632
2024-05-31 03:10:18,653 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 14:58:49, time: 0.817, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0499, loss_cls: 0.2143, acc: 92.2290, loss_bbox: 0.2653, loss_mask: 0.2566, loss: 0.8107
2024-05-31 03:10:58,799 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 14:58:06, time: 0.803, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0472, loss_cls: 0.2030, acc: 92.7205, loss_bbox: 0.2546, loss_mask: 0.2503, loss: 0.7775
2024-05-31 03:11:41,635 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 14:57:30, time: 0.857, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0503, loss_cls: 0.2072, acc: 92.4785, loss_bbox: 0.2644,
loss_mask: 0.2581, loss: 0.8044 2024-05-31 03:12:21,718 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 14:56:46, time: 0.802, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0491, loss_cls: 0.2064, acc: 92.4365, loss_bbox: 0.2679, loss_mask: 0.2551, loss: 0.8007 2024-05-31 03:13:04,219 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 14:56:09, time: 0.850, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0478, loss_cls: 0.1997, acc: 92.9065, loss_bbox: 0.2555, loss_mask: 0.2530, loss: 0.7793 2024-05-31 03:13:44,895 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 14:55:27, time: 0.813, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0525, loss_cls: 0.2154, acc: 92.2207, loss_bbox: 0.2668, loss_mask: 0.2572, loss: 0.8178 2024-05-31 03:14:25,210 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 14:54:44, time: 0.806, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0485, loss_cls: 0.2143, acc: 92.3667, loss_bbox: 0.2644, loss_mask: 0.2585, loss: 0.8084 2024-05-31 03:15:05,371 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 14:54:00, time: 0.803, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0471, loss_cls: 0.2019, acc: 92.6775, loss_bbox: 0.2562, loss_mask: 0.2514, loss: 0.7774 2024-05-31 03:15:45,491 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 14:53:16, time: 0.802, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0467, loss_cls: 0.1973, acc: 92.9512, loss_bbox: 0.2465, loss_mask: 0.2486, loss: 0.7606 2024-05-31 03:16:25,489 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 14:52:32, time: 0.800, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0503, loss_cls: 0.2139, acc: 92.3901, loss_bbox: 0.2624, loss_mask: 0.2542, loss: 0.8045 2024-05-31 03:17:05,720 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 14:51:49, time: 0.805, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0489, loss_cls: 0.1975, acc: 92.8313, loss_bbox: 0.2517, loss_mask: 0.2493, loss: 0.7691 2024-05-31 03:17:45,998 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 14:51:05, time: 0.806, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0504, loss_cls: 0.2198, acc: 92.0608, loss_bbox: 0.2736, loss_mask: 0.2599, loss: 0.8262 2024-05-31 03:18:26,266 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 14:50:22, time: 0.805, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0453, loss_cls: 0.2023, acc: 92.7605, loss_bbox: 0.2532, loss_mask: 0.2512, loss: 0.7735 2024-05-31 03:19:07,039 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 14:49:40, time: 0.816, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0495, loss_cls: 0.2047, acc: 92.6445, loss_bbox: 0.2599, loss_mask: 0.2514, loss: 0.7893 2024-05-31 03:19:47,183 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 14:48:57, time: 0.803, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0478, loss_cls: 0.2055, acc: 92.6484, loss_bbox: 0.2522, loss_mask: 0.2568, loss: 0.7868 2024-05-31 03:20:27,754 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 14:48:14, time: 0.811, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0516, loss_cls: 0.2042, acc: 92.7263, loss_bbox: 0.2569, loss_mask: 0.2513, loss: 0.7873 2024-05-31 03:21:11,194 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 14:47:40, time: 0.869, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0528, loss_cls: 0.2077, acc: 92.5037, loss_bbox: 0.2606, loss_mask: 0.2512, loss: 0.7956 2024-05-31 03:21:52,057 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 14:46:58, time: 0.817, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0478, loss_cls: 0.2068, acc: 92.5537, loss_bbox: 0.2597, loss_mask: 0.2562, loss: 0.7933 2024-05-31 03:22:40,175 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 14:46:37, time: 0.962, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0504, loss_cls: 0.2034, acc: 92.6135, loss_bbox: 0.2562, loss_mask: 0.2547, loss: 0.7877 2024-05-31 03:23:24,861 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 14:46:06, time: 0.894, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0470, loss_cls: 0.2073, acc: 92.6714, loss_bbox: 0.2538, loss_mask: 0.2517, loss: 0.7819 2024-05-31 03:24:05,257 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 14:45:23, time: 0.808, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0488, loss_cls: 0.2062, acc: 92.7000, loss_bbox: 0.2539, loss_mask: 0.2480, loss: 0.7803 2024-05-31 03:24:45,854 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 14:44:41, time: 0.812, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0505, loss_cls: 0.2132, acc: 92.3228, loss_bbox: 0.2663, loss_mask: 0.2558, loss: 0.8097 2024-05-31 03:25:25,956 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 14:43:57, time: 0.802, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0478, loss_cls: 0.2095, acc: 92.5364, loss_bbox: 0.2562, loss_mask: 0.2537, loss: 0.7911 2024-05-31 03:26:08,962 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 14:43:21, time: 0.860, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0516, loss_cls: 0.2067, acc: 92.5701, loss_bbox: 0.2572, loss_mask: 0.2565, loss: 0.7949 2024-05-31 03:26:49,414 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 14:42:38, time: 0.809, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0513, loss_cls: 0.2086, acc: 92.4377, loss_bbox: 0.2641, loss_mask: 0.2577, loss: 0.8042 2024-05-31 03:27:32,278 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 14:42:02, time: 0.857, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0526, loss_cls: 0.2028, acc: 92.6257, loss_bbox: 0.2569, loss_mask: 0.2533, loss: 0.7878 2024-05-31 03:28:12,990 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 14:41:20, time: 0.814, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0476, loss_cls: 0.2059, acc: 92.5435, loss_bbox: 0.2589, loss_mask: 0.2474, loss: 0.7826 2024-05-31 03:28:53,312 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 14:40:37, time: 0.806, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0465, loss_cls: 0.1930, acc: 93.0398, loss_bbox: 0.2447, loss_mask: 0.2521, loss: 0.7570 2024-05-31 03:29:33,631 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 14:39:54, time: 0.806, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0443, loss_cls: 0.2012, acc: 92.7349, loss_bbox: 0.2503, loss_mask: 0.2505, loss: 0.7674 2024-05-31 03:30:14,428 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 14:39:12, time: 0.816, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0226, 
loss_rpn_bbox: 0.0491, loss_cls: 0.2081, acc: 92.6292, loss_bbox: 0.2575, loss_mask: 0.2529, loss: 0.7902 2024-05-31 03:30:55,100 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 14:38:30, time: 0.813, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0497, loss_cls: 0.2053, acc: 92.7151, loss_bbox: 0.2555, loss_mask: 0.2506, loss: 0.7821 2024-05-31 03:31:35,379 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 14:37:47, time: 0.806, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0486, loss_cls: 0.2088, acc: 92.4963, loss_bbox: 0.2589, loss_mask: 0.2549, loss: 0.7935 2024-05-31 03:32:15,397 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 14:37:03, time: 0.800, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0474, loss_cls: 0.2047, acc: 92.5999, loss_bbox: 0.2543, loss_mask: 0.2496, loss: 0.7780 2024-05-31 03:32:56,004 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 14:36:21, time: 0.812, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0518, loss_cls: 0.2050, acc: 92.6545, loss_bbox: 0.2608, loss_mask: 0.2511, loss: 0.7928 2024-05-31 03:33:36,043 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 14:35:37, time: 0.801, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0472, loss_cls: 0.1995, acc: 92.9026, loss_bbox: 0.2450, loss_mask: 0.2548, loss: 0.7705 2024-05-31 03:34:17,032 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 14:34:55, time: 0.820, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0529, loss_cls: 0.2136, acc: 92.3083, loss_bbox: 0.2671, loss_mask: 0.2609, loss: 0.8189 2024-05-31 03:34:57,485 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 14:34:13, time: 0.809, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0504, loss_cls: 0.2030, acc: 92.7324, loss_bbox: 0.2554, loss_mask: 0.2517, loss: 0.7836 2024-05-31 03:35:38,597 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 14:33:32, time: 0.822, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0492, loss_cls: 0.2110, acc: 92.4619, loss_bbox: 0.2583, loss_mask: 0.2496, loss: 0.7897 2024-05-31 03:36:21,437 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 14:32:55, time: 0.857, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0452, loss_cls: 0.1989, acc: 92.7959, loss_bbox: 0.2508, loss_mask: 0.2492, loss: 0.7634 2024-05-31 03:37:05,684 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 14:32:23, time: 0.885, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0479, loss_cls: 0.2046, acc: 92.5876, loss_bbox: 0.2615, loss_mask: 0.2576, loss: 0.7945 2024-05-31 03:37:55,990 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 14:32:06, time: 1.006, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0496, loss_cls: 0.1987, acc: 92.8591, loss_bbox: 0.2496, loss_mask: 0.2522, loss: 0.7718 2024-05-31 03:38:36,803 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 14:31:24, time: 0.816, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0484, loss_cls: 0.2060, acc: 92.6807, loss_bbox: 0.2544, loss_mask: 0.2505, loss: 0.7844 2024-05-31 03:39:16,875 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 14:30:40, time: 0.801, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0477, loss_cls: 0.2009, acc: 92.7839, loss_bbox: 0.2536, loss_mask: 0.2542, 
loss: 0.7799 2024-05-31 03:39:57,154 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 14:29:57, time: 0.806, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0488, loss_cls: 0.1991, acc: 92.7891, loss_bbox: 0.2509, loss_mask: 0.2469, loss: 0.7666 2024-05-31 03:40:38,095 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 14:29:16, time: 0.819, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0501, loss_cls: 0.2063, acc: 92.4683, loss_bbox: 0.2651, loss_mask: 0.2548, loss: 0.7990 2024-05-31 03:41:21,057 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 14:28:39, time: 0.859, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0492, loss_cls: 0.2082, acc: 92.4541, loss_bbox: 0.2643, loss_mask: 0.2518, loss: 0.7966 2024-05-31 03:42:01,262 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 14:27:56, time: 0.804, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0483, loss_cls: 0.2015, acc: 92.7068, loss_bbox: 0.2582, loss_mask: 0.2516, loss: 0.7820 2024-05-31 03:42:43,702 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 14:27:18, time: 0.849, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0468, loss_cls: 0.1951, acc: 93.0667, loss_bbox: 0.2401, loss_mask: 0.2445, loss: 0.7484 2024-05-31 03:43:24,650 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 14:26:37, time: 0.819, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0505, loss_cls: 0.2095, acc: 92.5774, loss_bbox: 0.2598, loss_mask: 0.2551, loss: 0.7981 2024-05-31 03:44:04,804 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 14:25:53, time: 0.803, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0465, loss_cls: 0.2080, acc: 92.5239, loss_bbox: 0.2569, loss_mask: 0.2531, loss: 0.7853 2024-05-31 03:44:44,976 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 14:25:10, time: 0.804, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0479, loss_cls: 0.2028, acc: 92.7263, loss_bbox: 0.2523, loss_mask: 0.2531, loss: 0.7772 2024-05-31 03:45:25,965 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 14:24:29, time: 0.820, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0506, loss_cls: 0.2052, acc: 92.5562, loss_bbox: 0.2597, loss_mask: 0.2557, loss: 0.7949 2024-05-31 03:46:06,326 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 14:23:46, time: 0.807, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0490, loss_cls: 0.2006, acc: 92.8059, loss_bbox: 0.2502, loss_mask: 0.2512, loss: 0.7730 2024-05-31 03:46:46,881 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 14:23:03, time: 0.811, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0478, loss_cls: 0.1961, acc: 92.7888, loss_bbox: 0.2528, loss_mask: 0.2519, loss: 0.7726 2024-05-31 03:47:27,475 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 14:22:21, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0510, loss_cls: 0.2073, acc: 92.5356, loss_bbox: 0.2589, loss_mask: 0.2587, loss: 0.8009 2024-05-31 03:48:08,460 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 14:21:40, time: 0.820, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0477, loss_cls: 0.1951, acc: 92.9519, loss_bbox: 0.2450, loss_mask: 0.2459, loss: 0.7545 2024-05-31 03:48:48,538 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
14:20:56, time: 0.802, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0459, loss_cls: 0.1968, acc: 92.9729, loss_bbox: 0.2440, loss_mask: 0.2462, loss: 0.7548 2024-05-31 03:49:28,956 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 14:20:13, time: 0.808, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0496, loss_cls: 0.2077, acc: 92.5115, loss_bbox: 0.2597, loss_mask: 0.2487, loss: 0.7882 2024-05-31 03:50:10,143 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 14:19:33, time: 0.824, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0507, loss_cls: 0.2076, acc: 92.5483, loss_bbox: 0.2618, loss_mask: 0.2519, loss: 0.7965 2024-05-31 03:50:53,380 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 14:18:57, time: 0.865, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0490, loss_cls: 0.2071, acc: 92.4868, loss_bbox: 0.2599, loss_mask: 0.2536, loss: 0.7929 2024-05-31 03:51:37,489 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 14:18:23, time: 0.882, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0470, loss_cls: 0.2013, acc: 92.8093, loss_bbox: 0.2528, loss_mask: 0.2530, loss: 0.7764 2024-05-31 03:52:23,040 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 14:17:53, time: 0.911, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0461, loss_cls: 0.2025, acc: 92.7610, loss_bbox: 0.2496, loss_mask: 0.2464, loss: 0.7646 2024-05-31 03:53:07,933 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 14:17:21, time: 0.898, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0483, loss_cls: 0.2000, acc: 92.8550, loss_bbox: 0.2545, loss_mask: 0.2516, loss: 0.7776 2024-05-31 03:53:48,204 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 14:16:38, time: 0.805, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0499, loss_cls: 0.1984, acc: 92.8586, loss_bbox: 0.2534, loss_mask: 0.2493, loss: 0.7731 2024-05-31 03:54:28,861 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 14:15:56, time: 0.813, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0486, loss_cls: 0.2056, acc: 92.6719, loss_bbox: 0.2550, loss_mask: 0.2479, loss: 0.7789 2024-05-31 03:55:09,199 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 14:15:13, time: 0.807, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0489, loss_cls: 0.2034, acc: 92.6863, loss_bbox: 0.2569, loss_mask: 0.2546, loss: 0.7861 2024-05-31 03:55:51,332 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 14:14:34, time: 0.843, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0483, loss_cls: 0.1980, acc: 92.8999, loss_bbox: 0.2474, loss_mask: 0.2465, loss: 0.7623 2024-05-31 03:56:32,071 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 14:13:52, time: 0.815, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0506, loss_cls: 0.2100, acc: 92.4998, loss_bbox: 0.2582, loss_mask: 0.2595, loss: 0.8013 2024-05-31 03:57:14,828 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 14:13:15, time: 0.855, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0487, loss_cls: 0.2121, acc: 92.3279, loss_bbox: 0.2650, loss_mask: 0.2577, loss: 0.8038 2024-05-31 03:57:54,940 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 14:12:32, time: 0.802, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0231, loss_rpn_bbox: 
0.0489, loss_cls: 0.1970, acc: 92.8948, loss_bbox: 0.2458, loss_mask: 0.2498, loss: 0.7646 2024-05-31 03:58:35,286 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 14:11:49, time: 0.807, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0477, loss_cls: 0.2098, acc: 92.3542, loss_bbox: 0.2622, loss_mask: 0.2591, loss: 0.8022 2024-05-31 03:59:15,472 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 14:11:06, time: 0.804, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0494, loss_cls: 0.2007, acc: 92.9465, loss_bbox: 0.2475, loss_mask: 0.2524, loss: 0.7728 2024-05-31 03:59:55,795 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 14:10:23, time: 0.806, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0472, loss_cls: 0.1967, acc: 92.9309, loss_bbox: 0.2480, loss_mask: 0.2505, loss: 0.7629 2024-05-31 04:00:36,247 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 14:09:40, time: 0.809, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0478, loss_cls: 0.2049, acc: 92.7002, loss_bbox: 0.2522, loss_mask: 0.2529, loss: 0.7799 2024-05-31 04:01:17,985 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 14:09:00, time: 0.835, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0516, loss_cls: 0.2137, acc: 92.2322, loss_bbox: 0.2667, loss_mask: 0.2573, loss: 0.8135 2024-05-31 04:01:58,567 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 14:08:18, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0480, loss_cls: 0.2056, acc: 92.5881, loss_bbox: 0.2564, loss_mask: 0.2485, loss: 0.7795 2024-05-31 04:02:39,250 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 14:07:36, time: 0.814, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0498, loss_cls: 0.2082, acc: 92.5364, loss_bbox: 0.2551, loss_mask: 0.2539, loss: 0.7913 2024-05-31 04:03:19,954 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 14:06:54, time: 0.814, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0521, loss_cls: 0.2066, acc: 92.5217, loss_bbox: 0.2628, loss_mask: 0.2573, loss: 0.8028 2024-05-31 04:04:00,802 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 14:06:12, time: 0.817, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0514, loss_cls: 0.2122, acc: 92.3889, loss_bbox: 0.2639, loss_mask: 0.2564, loss: 0.8063 2024-05-31 04:04:41,604 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 14:05:30, time: 0.816, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0527, loss_cls: 0.2160, acc: 92.2073, loss_bbox: 0.2689, loss_mask: 0.2545, loss: 0.8166 2024-05-31 04:05:21,759 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 14:04:47, time: 0.803, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0472, loss_cls: 0.1992, acc: 92.8240, loss_bbox: 0.2512, loss_mask: 0.2551, loss: 0.7734 2024-05-31 04:06:04,682 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 14:04:10, time: 0.859, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0452, loss_cls: 0.2046, acc: 92.7534, loss_bbox: 0.2473, loss_mask: 0.2471, loss: 0.7667 2024-05-31 04:06:52,225 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 14:03:39, time: 0.902, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0493, loss_cls: 0.2028, acc: 92.6577, loss_bbox: 0.2572, loss_mask: 0.2514, loss: 0.7839 
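Each iteration entry above follows the same `name: value` layout, and the logged total loss is, up to rounding of the printed four-decimal figures, the sum of the five component terms (loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask; acc is a classification accuracy, not a loss). Below is a minimal parsing sketch that assumes only the line format shown in this log; the helper and regular expression are illustrative conveniences, not an MMDetection API.

```python
import re

# Illustrative sketch only (helper name and regex are assumptions mirroring the log
# format above, not an MMDetection API): parse one iteration entry and check that
# the logged total `loss` equals the sum of its components, up to print rounding.
ENTRY = ("loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0528, loss_cls: 0.2077, "
         "acc: 92.5037, loss_bbox: 0.2606, loss_mask: 0.2512, loss: 0.7956")

def parse_entry(entry: str) -> dict:
    """Return {field: value} for every `name: number` pair in an entry."""
    return {key: float(val) for key, val in re.findall(r"(\w+): ([0-9.]+)", entry)}

fields = parse_entry(ENTRY)
parts = [v for k, v in fields.items() if k.startswith("loss_")]  # the five loss terms
total = sum(parts)  # 0.0234 + 0.0528 + 0.2077 + 0.2606 + 0.2512 = 0.7957
assert abs(total - fields["loss"]) < 1e-3  # logged 0.7956; difference is rounding only
print(f"sum of loss terms = {total:.4f}, logged loss = {fields['loss']:.4f}")
```

The same parser applies to any of the entries in this log, e.g. to plot per-component loss curves against iteration count.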
2024-05-31 04:07:37,211 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 14:03:12, time: 0.948, data_time: 0.106, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0481, loss_cls: 0.2091, acc: 92.5281, loss_bbox: 0.2588, loss_mask: 0.2484, loss: 0.7868 2024-05-31 04:08:17,906 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 14:02:30, time: 0.814, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0474, loss_cls: 0.2059, acc: 92.6125, loss_bbox: 0.2547, loss_mask: 0.2481, loss: 0.7788 2024-05-31 04:08:58,199 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 14:01:47, time: 0.806, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0480, loss_cls: 0.2028, acc: 92.6731, loss_bbox: 0.2542, loss_mask: 0.2528, loss: 0.7813 2024-05-31 04:09:39,219 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 14:01:06, time: 0.820, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0480, loss_cls: 0.2066, acc: 92.6201, loss_bbox: 0.2582, loss_mask: 0.2474, loss: 0.7822 2024-05-31 04:10:21,622 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 14:00:27, time: 0.848, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0454, loss_cls: 0.1958, acc: 93.0222, loss_bbox: 0.2439, loss_mask: 0.2432, loss: 0.7488 2024-05-31 04:11:02,608 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 13:59:46, time: 0.820, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0529, loss_cls: 0.2105, acc: 92.5093, loss_bbox: 0.2582, loss_mask: 0.2524, loss: 0.7980 2024-05-31 04:11:45,364 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 13:59:09, time: 0.855, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0462, loss_cls: 0.1957, acc: 92.9546, loss_bbox: 0.2448, loss_mask: 0.2522, loss: 0.7607 2024-05-31 04:12:25,490 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 13:58:25, time: 0.803, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0502, loss_cls: 0.1955, acc: 92.9287, loss_bbox: 0.2498, loss_mask: 0.2476, loss: 0.7674 2024-05-31 04:13:05,606 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 13:57:42, time: 0.802, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0464, loss_cls: 0.1975, acc: 92.9167, loss_bbox: 0.2487, loss_mask: 0.2500, loss: 0.7633 2024-05-31 04:13:46,304 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 13:57:00, time: 0.813, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0511, loss_cls: 0.2093, acc: 92.5300, loss_bbox: 0.2637, loss_mask: 0.2570, loss: 0.8059 2024-05-31 04:14:26,602 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 13:56:17, time: 0.806, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0516, loss_cls: 0.2085, acc: 92.4421, loss_bbox: 0.2595, loss_mask: 0.2523, loss: 0.7950 2024-05-31 04:15:06,811 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 13:55:34, time: 0.804, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0456, loss_cls: 0.1974, acc: 92.9866, loss_bbox: 0.2454, loss_mask: 0.2543, loss: 0.7618 2024-05-31 04:15:48,008 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 13:54:53, time: 0.824, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0481, loss_cls: 0.1976, acc: 92.8506, loss_bbox: 0.2469, loss_mask: 0.2568, loss: 0.7702 2024-05-31 04:16:28,704 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 13:54:11, 
time: 0.814, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0485, loss_cls: 0.1997, acc: 92.7351, loss_bbox: 0.2578, loss_mask: 0.2579, loss: 0.7844 2024-05-31 04:17:09,518 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 13:53:29, time: 0.816, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0503, loss_cls: 0.2105, acc: 92.4187, loss_bbox: 0.2625, loss_mask: 0.2555, loss: 0.8029 2024-05-31 04:17:49,809 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 13:52:46, time: 0.806, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0480, loss_cls: 0.2060, acc: 92.6687, loss_bbox: 0.2554, loss_mask: 0.2495, loss: 0.7807 2024-05-31 04:18:30,552 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 13:52:04, time: 0.815, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0484, loss_cls: 0.2035, acc: 92.7271, loss_bbox: 0.2507, loss_mask: 0.2502, loss: 0.7738 2024-05-31 04:19:11,121 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 13:51:22, time: 0.811, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0479, loss_cls: 0.2009, acc: 92.7590, loss_bbox: 0.2539, loss_mask: 0.2520, loss: 0.7774 2024-05-31 04:19:51,401 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 13:50:39, time: 0.806, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0499, loss_cls: 0.2001, acc: 92.8970, loss_bbox: 0.2479, loss_mask: 0.2495, loss: 0.7708 2024-05-31 04:20:34,076 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 13:50:01, time: 0.853, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0500, loss_cls: 0.2004, acc: 92.7759, loss_bbox: 0.2541, loss_mask: 0.2534, loss: 0.7816 2024-05-31 04:21:20,342 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 13:49:31, time: 0.925, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0510, loss_cls: 0.2094, acc: 92.4722, loss_bbox: 0.2605, loss_mask: 0.2537, loss: 0.7986 2024-05-31 04:22:06,768 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 13:49:02, time: 0.929, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0462, loss_cls: 0.2002, acc: 92.8274, loss_bbox: 0.2473, loss_mask: 0.2485, loss: 0.7633 2024-05-31 04:22:49,400 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 13:48:24, time: 0.853, data_time: 0.034, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0459, loss_cls: 0.1975, acc: 92.9224, loss_bbox: 0.2466, loss_mask: 0.2405, loss: 0.7539 2024-05-31 04:23:29,905 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 13:47:42, time: 0.810, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0466, loss_cls: 0.1994, acc: 93.0303, loss_bbox: 0.2373, loss_mask: 0.2460, loss: 0.7528 2024-05-31 04:24:11,170 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 13:47:01, time: 0.825, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0511, loss_cls: 0.2089, acc: 92.4436, loss_bbox: 0.2595, loss_mask: 0.2544, loss: 0.7978 2024-05-31 04:24:51,556 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 13:46:18, time: 0.808, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0511, loss_cls: 0.2157, acc: 92.2842, loss_bbox: 0.2640, loss_mask: 0.2524, loss: 0.8072 2024-05-31 04:25:33,638 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 13:45:39, time: 0.842, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0502, 
loss_cls: 0.2049, acc: 92.7112, loss_bbox: 0.2517, loss_mask: 0.2505, loss: 0.7794 2024-05-31 04:26:14,140 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 13:44:56, time: 0.810, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0492, loss_cls: 0.2036, acc: 92.6528, loss_bbox: 0.2558, loss_mask: 0.2492, loss: 0.7791 2024-05-31 04:26:57,071 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 13:44:19, time: 0.859, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0464, loss_cls: 0.2062, acc: 92.6125, loss_bbox: 0.2543, loss_mask: 0.2515, loss: 0.7795 2024-05-31 04:27:37,685 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 13:43:37, time: 0.812, data_time: 0.078, memory: 18874, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0474, loss_cls: 0.1961, acc: 93.0125, loss_bbox: 0.2424, loss_mask: 0.2508, loss: 0.7589 2024-05-31 04:28:17,826 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 13:42:54, time: 0.803, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0504, loss_cls: 0.2120, acc: 92.4822, loss_bbox: 0.2598, loss_mask: 0.2567, loss: 0.8028 2024-05-31 04:28:58,354 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 13:42:11, time: 0.810, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0489, loss_cls: 0.2003, acc: 92.8484, loss_bbox: 0.2513, loss_mask: 0.2494, loss: 0.7743 2024-05-31 04:29:39,369 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 13:41:30, time: 0.820, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0523, loss_cls: 0.2108, acc: 92.4495, loss_bbox: 0.2603, loss_mask: 0.2548, loss: 0.8017 2024-05-31 04:30:19,796 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 13:40:47, time: 0.809, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0466, loss_cls: 0.2004, acc: 92.8140, loss_bbox: 0.2517, loss_mask: 0.2431, loss: 0.7619 2024-05-31 04:31:00,480 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 13:40:05, time: 0.814, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0517, loss_cls: 0.2129, acc: 92.3223, loss_bbox: 0.2605, loss_mask: 0.2512, loss: 0.7992 2024-05-31 04:31:41,528 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 13:39:24, time: 0.821, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0489, loss_cls: 0.2001, acc: 92.8032, loss_bbox: 0.2534, loss_mask: 0.2475, loss: 0.7718 2024-05-31 04:32:21,649 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 13:38:41, time: 0.802, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0479, loss_cls: 0.2004, acc: 92.7292, loss_bbox: 0.2489, loss_mask: 0.2443, loss: 0.7640 2024-05-31 04:33:01,578 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 13:37:57, time: 0.799, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0442, loss_cls: 0.1964, acc: 92.9666, loss_bbox: 0.2437, loss_mask: 0.2447, loss: 0.7492 2024-05-31 04:33:41,971 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 13:37:14, time: 0.808, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0460, loss_cls: 0.2034, acc: 92.5808, loss_bbox: 0.2556, loss_mask: 0.2512, loss: 0.7786 2024-05-31 04:34:22,342 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 13:36:32, time: 0.807, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0457, loss_cls: 0.1929, acc: 93.0266, loss_bbox: 0.2460, loss_mask: 0.2475, loss: 0.7532 2024-05-31 
04:35:06,186 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 13:35:56, time: 0.877, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0476, loss_cls: 0.2091, acc: 92.5100, loss_bbox: 0.2525, loss_mask: 0.2480, loss: 0.7792 2024-05-31 04:35:48,807 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 13:35:18, time: 0.852, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0484, loss_cls: 0.2017, acc: 92.8384, loss_bbox: 0.2483, loss_mask: 0.2498, loss: 0.7718 2024-05-31 04:36:33,979 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 13:34:45, time: 0.903, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0456, loss_cls: 0.2055, acc: 92.5989, loss_bbox: 0.2517, loss_mask: 0.2520, loss: 0.7774 2024-05-31 04:37:19,483 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 13:34:13, time: 0.910, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0473, loss_cls: 0.1951, acc: 92.9236, loss_bbox: 0.2471, loss_mask: 0.2432, loss: 0.7547 2024-05-31 04:38:00,294 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 13:33:32, time: 0.816, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0506, loss_cls: 0.2096, acc: 92.4099, loss_bbox: 0.2594, loss_mask: 0.2480, loss: 0.7894 2024-05-31 04:38:41,089 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 13:32:50, time: 0.816, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0477, loss_cls: 0.2004, acc: 92.8381, loss_bbox: 0.2455, loss_mask: 0.2489, loss: 0.7650 2024-05-31 04:39:21,625 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 13:32:07, time: 0.811, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0458, loss_cls: 0.2020, acc: 92.6343, loss_bbox: 0.2558, loss_mask: 0.2497, loss: 0.7741 2024-05-31 04:40:04,413 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 13:31:30, time: 0.856, data_time: 0.085, memory: 18874, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0493, loss_cls: 0.2039, acc: 92.6399, loss_bbox: 0.2508, loss_mask: 0.2506, loss: 0.7773 2024-05-31 04:40:44,525 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 13:30:46, time: 0.802, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0521, loss_cls: 0.2142, acc: 92.3965, loss_bbox: 0.2622, loss_mask: 0.2554, loss: 0.8084 2024-05-31 04:41:27,101 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 13:30:08, time: 0.852, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0467, loss_cls: 0.1977, acc: 92.8064, loss_bbox: 0.2422, loss_mask: 0.2431, loss: 0.7522 2024-05-31 04:42:07,898 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 13:29:26, time: 0.816, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0496, loss_cls: 0.2083, acc: 92.5645, loss_bbox: 0.2545, loss_mask: 0.2556, loss: 0.7918 2024-05-31 04:42:47,999 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 13:28:43, time: 0.802, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0484, loss_cls: 0.2043, acc: 92.6792, loss_bbox: 0.2492, loss_mask: 0.2451, loss: 0.7694 2024-05-31 04:43:28,402 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 13:28:00, time: 0.808, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0461, loss_cls: 0.2059, acc: 92.5479, loss_bbox: 0.2556, loss_mask: 0.2523, loss: 0.7809 2024-05-31 04:44:09,113 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 13:27:18, time: 0.814, 
data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0478, loss_cls: 0.2019, acc: 92.7214, loss_bbox: 0.2529, loss_mask: 0.2445, loss: 0.7682 2024-05-31 04:44:49,810 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 13:26:36, time: 0.814, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0460, loss_cls: 0.2110, acc: 92.3992, loss_bbox: 0.2568, loss_mask: 0.2516, loss: 0.7889 2024-05-31 04:45:31,123 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 13:25:55, time: 0.826, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0475, loss_cls: 0.2070, acc: 92.4790, loss_bbox: 0.2531, loss_mask: 0.2544, loss: 0.7840 2024-05-31 04:46:11,587 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 13:25:13, time: 0.809, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0451, loss_cls: 0.1931, acc: 93.0122, loss_bbox: 0.2447, loss_mask: 0.2420, loss: 0.7455 2024-05-31 04:46:52,355 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 13:24:31, time: 0.815, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0473, loss_cls: 0.2023, acc: 92.7751, loss_bbox: 0.2479, loss_mask: 0.2490, loss: 0.7682
2024-05-31 04:47:17,664 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-05-31 04:49:06,234 - mmdet - INFO - Evaluating bbox...
2024-05-31 04:49:30,179 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.436
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.675
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.480
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.262
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.485
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.601
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.557
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.557
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.557
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.357
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.609
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.726
2024-05-31 04:49:30,179 - mmdet - INFO - Evaluating segm...
2024-05-31 04:49:54,829 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.398
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.642
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.424
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.181
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.436
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.619
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.510
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.510
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.510
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.294
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.562
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.706
2024-05-31 04:49:55,158 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 04:49:55,159 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4360, bbox_mAP_50: 0.6750, bbox_mAP_75: 0.4800, bbox_mAP_s: 0.2620, bbox_mAP_m: 0.4850, bbox_mAP_l: 0.6010, bbox_mAP_copypaste: 0.436 0.675 0.480 0.262 0.485 0.601, segm_mAP: 0.3980, segm_mAP_50: 0.6420, segm_mAP_75: 0.4240, segm_mAP_s: 0.1810, segm_mAP_m: 0.4360, segm_mAP_l: 0.6190, segm_mAP_copypaste: 0.398 0.642 0.424 0.181 0.436 0.619
2024-05-31 04:50:48,550 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 13:23:00, time: 1.067, data_time: 0.125, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0463, loss_cls: 0.1849, acc: 93.2854, loss_bbox: 0.2372, loss_mask: 0.2391, loss: 0.7263 2024-05-31 04:51:32,535 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 13:22:25, time: 0.880, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0448, loss_cls: 0.1917, acc: 93.0486, loss_bbox: 0.2409, loss_mask: 0.2457, loss: 0.7436 2024-05-31 04:52:18,179 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 13:21:53, time: 0.913, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0485, loss_cls: 0.1947, acc: 92.9246, loss_bbox: 0.2455, loss_mask: 0.2440, loss: 0.7547 2024-05-31 04:52:58,828 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 13:21:11, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0446, loss_cls: 0.1835, acc: 93.3733, loss_bbox: 0.2293, loss_mask: 0.2422, loss: 0.7192 2024-05-31 04:53:39,790 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 13:20:29, time: 0.819, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0445, loss_cls: 0.1869, acc: 93.0571, loss_bbox: 0.2430, loss_mask: 0.2463, loss: 0.7382 2024-05-31 04:54:21,313 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 13:19:49, time: 0.830, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0474, loss_cls: 0.1978, acc: 92.8757, loss_bbox: 0.2491, loss_mask: 0.2449, loss: 0.7580 2024-05-31 04:55:01,630 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 13:19:06, time: 0.806, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0451, loss_cls: 0.1901, acc: 93.1030, loss_bbox: 0.2392, loss_mask: 0.2447, loss: 0.7389 2024-05-31 04:55:42,202 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 13:18:24, time: 0.811, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0475, loss_cls: 0.1963, acc: 92.8921, loss_bbox: 0.2514,
loss_mask: 0.2450, loss: 0.7599 2024-05-31 04:56:22,736 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 13:17:42, time: 0.811, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0468, loss_cls: 0.1929, acc: 92.9194, loss_bbox: 0.2480, loss_mask: 0.2474, loss: 0.7551 2024-05-31 04:57:03,335 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 13:17:00, time: 0.812, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0474, loss_cls: 0.1943, acc: 92.8604, loss_bbox: 0.2497, loss_mask: 0.2522, loss: 0.7636 2024-05-31 04:57:43,965 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 13:16:17, time: 0.813, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0438, loss_cls: 0.1886, acc: 93.1072, loss_bbox: 0.2435, loss_mask: 0.2483, loss: 0.7441 2024-05-31 04:58:24,304 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 13:15:35, time: 0.807, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0456, loss_cls: 0.1872, acc: 93.0894, loss_bbox: 0.2452, loss_mask: 0.2446, loss: 0.7404 2024-05-31 04:59:07,787 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 13:14:58, time: 0.870, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0483, loss_cls: 0.1971, acc: 92.8184, loss_bbox: 0.2531, loss_mask: 0.2506, loss: 0.7718 2024-05-31 04:59:47,956 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 13:14:15, time: 0.803, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0469, loss_cls: 0.1907, acc: 93.0801, loss_bbox: 0.2426, loss_mask: 0.2430, loss: 0.7427 2024-05-31 05:00:28,983 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 13:13:34, time: 0.821, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0455, loss_cls: 0.1982, acc: 92.7852, loss_bbox: 0.2535, loss_mask: 0.2448, loss: 0.7626 2024-05-31 05:01:10,473 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 13:12:54, time: 0.830, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0515, loss_cls: 0.2079, acc: 92.4485, loss_bbox: 0.2639, loss_mask: 0.2519, loss: 0.7976 2024-05-31 05:01:51,238 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 13:12:12, time: 0.815, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0502, loss_cls: 0.2042, acc: 92.5239, loss_bbox: 0.2584, loss_mask: 0.2472, loss: 0.7810 2024-05-31 05:02:32,561 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 13:11:31, time: 0.826, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0496, loss_cls: 0.2075, acc: 92.4548, loss_bbox: 0.2621, loss_mask: 0.2500, loss: 0.7929 2024-05-31 05:03:16,047 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 13:10:54, time: 0.870, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0496, loss_cls: 0.1995, acc: 92.7446, loss_bbox: 0.2540, loss_mask: 0.2474, loss: 0.7722 2024-05-31 05:03:56,430 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 13:10:12, time: 0.808, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0447, loss_cls: 0.1884, acc: 93.1497, loss_bbox: 0.2396, loss_mask: 0.2383, loss: 0.7300 2024-05-31 05:04:39,896 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 13:09:35, time: 0.869, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0452, loss_cls: 0.2010, acc: 92.7141, loss_bbox: 0.2545, loss_mask: 0.2465, loss: 0.7655 2024-05-31 05:05:22,903 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 13:08:58, time: 0.860, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0447, loss_cls: 0.1918, acc: 93.0413, loss_bbox: 0.2439, loss_mask: 0.2379, loss: 0.7371 2024-05-31 05:06:06,424 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 13:08:21, time: 0.870, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0467, loss_cls: 0.1999, acc: 92.8374, loss_bbox: 0.2511, loss_mask: 0.2465, loss: 0.7640 2024-05-31 05:06:48,955 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 13:07:42, time: 0.851, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0448, loss_cls: 0.1892, acc: 93.2000, loss_bbox: 0.2404, loss_mask: 0.2425, loss: 0.7369 2024-05-31 05:07:32,618 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 13:07:06, time: 0.873, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0471, loss_cls: 0.1981, acc: 92.7949, loss_bbox: 0.2486, loss_mask: 0.2468, loss: 0.7603 2024-05-31 05:08:13,733 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 13:06:25, time: 0.822, data_time: 0.076, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0464, loss_cls: 0.1967, acc: 92.7705, loss_bbox: 0.2510, loss_mask: 0.2457, loss: 0.7606 2024-05-31 05:08:56,588 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 13:05:47, time: 0.857, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0477, loss_cls: 0.1850, acc: 93.3308, loss_bbox: 0.2358, loss_mask: 0.2435, loss: 0.7330 2024-05-31 05:09:37,195 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 13:05:05, time: 0.812, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0457, loss_cls: 0.1878, acc: 93.2952, loss_bbox: 0.2360, loss_mask: 0.2372, loss: 0.7257 2024-05-31 05:10:17,118 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 13:04:21, time: 0.799, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0445, loss_cls: 0.1852, acc: 93.1492, loss_bbox: 0.2379, loss_mask: 0.2364, loss: 0.7225 2024-05-31 05:10:58,041 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 13:03:40, time: 0.818, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0472, loss_cls: 0.1950, acc: 92.9172, loss_bbox: 0.2477, loss_mask: 0.2451, loss: 0.7549 2024-05-31 05:11:38,520 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 13:02:57, time: 0.810, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0463, loss_cls: 0.1992, acc: 92.8442, loss_bbox: 0.2495, loss_mask: 0.2468, loss: 0.7629 2024-05-31 05:12:19,270 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 13:02:15, time: 0.815, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0462, loss_cls: 0.1938, acc: 92.9424, loss_bbox: 0.2429, loss_mask: 0.2445, loss: 0.7477 2024-05-31 05:13:00,014 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 13:01:34, time: 0.815, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0464, loss_cls: 0.1924, acc: 93.0745, loss_bbox: 0.2458, loss_mask: 0.2447, loss: 0.7486 2024-05-31 05:13:45,226 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 13:01:00, time: 0.904, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0459, loss_cls: 0.1908, acc: 93.1853, loss_bbox: 0.2374, loss_mask: 0.2344, loss: 0.7296 2024-05-31 05:14:26,616 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 13:00:19, time: 0.828, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0203, 
loss_rpn_bbox: 0.0470, loss_cls: 0.1909, acc: 93.1565, loss_bbox: 0.2396, loss_mask: 0.2404, loss: 0.7382 2024-05-31 05:15:07,506 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 12:59:38, time: 0.818, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0484, loss_cls: 0.1952, acc: 92.9666, loss_bbox: 0.2441, loss_mask: 0.2457, loss: 0.7541 2024-05-31 05:15:48,349 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 12:58:56, time: 0.817, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0482, loss_cls: 0.1993, acc: 92.7971, loss_bbox: 0.2502, loss_mask: 0.2450, loss: 0.7636 2024-05-31 05:16:28,813 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 12:58:13, time: 0.809, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0488, loss_cls: 0.1967, acc: 92.8713, loss_bbox: 0.2426, loss_mask: 0.2473, loss: 0.7572 2024-05-31 05:17:08,939 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 12:57:30, time: 0.802, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0434, loss_cls: 0.1851, acc: 93.4402, loss_bbox: 0.2325, loss_mask: 0.2400, loss: 0.7208 2024-05-31 05:17:52,572 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 12:56:54, time: 0.873, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0474, loss_cls: 0.1893, acc: 93.0139, loss_bbox: 0.2473, loss_mask: 0.2448, loss: 0.7490 2024-05-31 05:18:33,100 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 12:56:12, time: 0.810, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0471, loss_cls: 0.2018, acc: 92.5605, loss_bbox: 0.2505, loss_mask: 0.2399, loss: 0.7598 2024-05-31 05:19:16,674 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 12:55:35, time: 0.872, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0499, loss_cls: 0.2075, acc: 92.4192, loss_bbox: 0.2612, loss_mask: 0.2526, loss: 0.7929 2024-05-31 05:19:59,987 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 12:54:57, time: 0.866, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0474, loss_cls: 0.2004, acc: 92.7246, loss_bbox: 0.2544, loss_mask: 0.2497, loss: 0.7709 2024-05-31 05:20:43,723 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 12:54:21, time: 0.875, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0446, loss_cls: 0.1889, acc: 93.1106, loss_bbox: 0.2392, loss_mask: 0.2419, loss: 0.7339 2024-05-31 05:21:26,623 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 12:53:43, time: 0.858, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0450, loss_cls: 0.1921, acc: 93.0200, loss_bbox: 0.2436, loss_mask: 0.2435, loss: 0.7442 2024-05-31 05:22:09,805 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 12:53:05, time: 0.864, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0439, loss_cls: 0.1948, acc: 92.9614, loss_bbox: 0.2476, loss_mask: 0.2440, loss: 0.7507 2024-05-31 05:22:51,260 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 12:52:25, time: 0.829, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0495, loss_cls: 0.2038, acc: 92.4856, loss_bbox: 0.2550, loss_mask: 0.2454, loss: 0.7761 2024-05-31 05:23:33,605 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 12:51:46, time: 0.847, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0445, loss_cls: 0.2018, acc: 92.4509, loss_bbox: 0.2535, loss_mask: 0.2428, 
loss: 0.7614 2024-05-31 05:24:14,505 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 12:51:04, time: 0.818, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0440, loss_cls: 0.1915, acc: 92.9421, loss_bbox: 0.2480, loss_mask: 0.2436, loss: 0.7462 2024-05-31 05:24:54,831 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 12:50:21, time: 0.807, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0439, loss_cls: 0.1935, acc: 92.9614, loss_bbox: 0.2424, loss_mask: 0.2421, loss: 0.7416 2024-05-31 05:25:34,905 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 12:49:38, time: 0.801, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0440, loss_cls: 0.1841, acc: 93.2273, loss_bbox: 0.2351, loss_mask: 0.2409, loss: 0.7222 2024-05-31 05:26:16,091 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 12:48:57, time: 0.824, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0493, loss_cls: 0.1973, acc: 92.8545, loss_bbox: 0.2529, loss_mask: 0.2473, loss: 0.7677 2024-05-31 05:26:57,176 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 12:48:16, time: 0.822, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0456, loss_cls: 0.1989, acc: 92.7422, loss_bbox: 0.2510, loss_mask: 0.2471, loss: 0.7612 2024-05-31 05:27:38,169 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 12:47:34, time: 0.820, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0447, loss_cls: 0.1851, acc: 93.3369, loss_bbox: 0.2329, loss_mask: 0.2407, loss: 0.7214 2024-05-31 05:28:21,016 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 12:46:56, time: 0.857, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0443, loss_cls: 0.1936, acc: 92.9285, loss_bbox: 0.2441, loss_mask: 0.2424, loss: 0.7454 2024-05-31 05:29:01,552 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 12:46:14, time: 0.811, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0443, loss_cls: 0.1888, acc: 93.0840, loss_bbox: 0.2415, loss_mask: 0.2354, loss: 0.7297 2024-05-31 05:29:41,807 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 12:45:31, time: 0.805, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0456, loss_cls: 0.1899, acc: 93.1860, loss_bbox: 0.2376, loss_mask: 0.2445, loss: 0.7374 2024-05-31 05:30:22,399 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 12:44:49, time: 0.812, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0452, loss_cls: 0.1990, acc: 92.8174, loss_bbox: 0.2481, loss_mask: 0.2480, loss: 0.7611 2024-05-31 05:31:03,695 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 12:44:08, time: 0.826, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0470, loss_cls: 0.1997, acc: 92.7715, loss_bbox: 0.2453, loss_mask: 0.2440, loss: 0.7559 2024-05-31 05:31:44,129 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 12:43:26, time: 0.809, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0480, loss_cls: 0.1919, acc: 93.0315, loss_bbox: 0.2441, loss_mask: 0.2481, loss: 0.7522 2024-05-31 05:32:26,979 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 12:42:47, time: 0.857, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0473, loss_cls: 0.1865, acc: 93.1125, loss_bbox: 0.2397, loss_mask: 0.2434, loss: 0.7372 2024-05-31 05:33:08,415 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
12:42:07, time: 0.829, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0478, loss_cls: 0.1923, acc: 93.0427, loss_bbox: 0.2464, loss_mask: 0.2485, loss: 0.7549 2024-05-31 05:33:51,377 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 12:41:28, time: 0.859, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0487, loss_cls: 0.1898, acc: 93.1973, loss_bbox: 0.2361, loss_mask: 0.2399, loss: 0.7356 2024-05-31 05:34:33,913 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 12:40:50, time: 0.851, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0421, loss_cls: 0.1893, acc: 93.1318, loss_bbox: 0.2411, loss_mask: 0.2464, loss: 0.7380 2024-05-31 05:35:18,441 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 12:40:14, time: 0.891, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0493, loss_cls: 0.1990, acc: 92.7083, loss_bbox: 0.2490, loss_mask: 0.2495, loss: 0.7701 2024-05-31 05:36:01,782 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 12:39:37, time: 0.867, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0477, loss_cls: 0.1970, acc: 92.9475, loss_bbox: 0.2494, loss_mask: 0.2457, loss: 0.7601 2024-05-31 05:36:43,713 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 12:38:57, time: 0.839, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0419, loss_cls: 0.1742, acc: 93.7151, loss_bbox: 0.2204, loss_mask: 0.2380, loss: 0.6920 2024-05-31 05:37:24,735 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 12:38:15, time: 0.820, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0491, loss_cls: 0.1931, acc: 92.9399, loss_bbox: 0.2486, loss_mask: 0.2460, loss: 0.7565 2024-05-31 05:38:07,375 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 12:37:37, time: 0.853, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0431, loss_cls: 0.1796, acc: 93.4097, loss_bbox: 0.2337, loss_mask: 0.2440, loss: 0.7182 2024-05-31 05:38:48,128 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 12:36:55, time: 0.815, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0445, loss_cls: 0.1888, acc: 93.1890, loss_bbox: 0.2357, loss_mask: 0.2453, loss: 0.7336 2024-05-31 05:39:28,990 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 12:36:13, time: 0.817, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0489, loss_cls: 0.2001, acc: 92.6301, loss_bbox: 0.2562, loss_mask: 0.2557, loss: 0.7837 2024-05-31 05:40:10,261 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 12:35:32, time: 0.825, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0466, loss_cls: 0.1952, acc: 92.8032, loss_bbox: 0.2489, loss_mask: 0.2437, loss: 0.7542 2024-05-31 05:40:50,488 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 12:34:49, time: 0.805, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0435, loss_cls: 0.1863, acc: 93.1362, loss_bbox: 0.2400, loss_mask: 0.2448, loss: 0.7337 2024-05-31 05:41:31,507 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 12:34:08, time: 0.820, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0466, loss_cls: 0.1941, acc: 92.9624, loss_bbox: 0.2425, loss_mask: 0.2478, loss: 0.7509 2024-05-31 05:42:11,882 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 12:33:25, time: 0.808, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0236, loss_rpn_bbox: 
0.0468, loss_cls: 0.1933, acc: 92.9497, loss_bbox: 0.2416, loss_mask: 0.2421, loss: 0.7474 2024-05-31 05:42:52,938 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 12:32:44, time: 0.821, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0475, loss_cls: 0.1914, acc: 93.0842, loss_bbox: 0.2396, loss_mask: 0.2398, loss: 0.7383 2024-05-31 05:43:35,712 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 12:32:05, time: 0.855, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0485, loss_cls: 0.1906, acc: 93.0847, loss_bbox: 0.2395, loss_mask: 0.2473, loss: 0.7453 2024-05-31 05:44:16,335 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 12:31:23, time: 0.812, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0457, loss_cls: 0.2019, acc: 92.6565, loss_bbox: 0.2533, loss_mask: 0.2448, loss: 0.7655 2024-05-31 05:44:56,884 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 12:30:41, time: 0.811, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0450, loss_cls: 0.1960, acc: 92.8010, loss_bbox: 0.2491, loss_mask: 0.2433, loss: 0.7543 2024-05-31 05:45:37,976 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 12:30:00, time: 0.822, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0486, loss_cls: 0.1974, acc: 92.8818, loss_bbox: 0.2422, loss_mask: 0.2495, loss: 0.7600 2024-05-31 05:46:18,423 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 12:29:17, time: 0.809, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0454, loss_cls: 0.1904, acc: 93.0752, loss_bbox: 0.2394, loss_mask: 0.2399, loss: 0.7349 2024-05-31 05:47:01,896 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 12:28:40, time: 0.869, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0467, loss_cls: 0.2002, acc: 92.7053, loss_bbox: 0.2479, loss_mask: 0.2405, loss: 0.7558 2024-05-31 05:47:42,772 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 12:27:58, time: 0.818, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0489, loss_cls: 0.2029, acc: 92.6338, loss_bbox: 0.2548, loss_mask: 0.2460, loss: 0.7730 2024-05-31 05:48:26,131 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 12:27:21, time: 0.867, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0493, loss_cls: 0.2014, acc: 92.6917, loss_bbox: 0.2526, loss_mask: 0.2513, loss: 0.7746 2024-05-31 05:49:09,504 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 12:26:43, time: 0.867, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0496, loss_cls: 0.1940, acc: 92.9941, loss_bbox: 0.2421, loss_mask: 0.2428, loss: 0.7516 2024-05-31 05:49:52,755 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 12:26:05, time: 0.865, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0478, loss_cls: 0.1998, acc: 92.7334, loss_bbox: 0.2550, loss_mask: 0.2477, loss: 0.7701 2024-05-31 05:50:35,899 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 12:25:27, time: 0.863, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0460, loss_cls: 0.1909, acc: 93.0647, loss_bbox: 0.2451, loss_mask: 0.2428, loss: 0.7460 2024-05-31 05:51:19,056 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 12:24:49, time: 0.863, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0453, loss_cls: 0.1957, acc: 92.9492, loss_bbox: 0.2442, loss_mask: 0.2445, loss: 0.7513 
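Besides the per-iteration entries, each epoch closes with an evaluation block like the one logged after epoch 4 above: the full COCO AP/AR tables for bbox and segm, followed by an Epoch(val) summary line whose bbox_mAP_copypaste / segm_mAP_copypaste fields pack the six headline numbers (AP, AP50, AP75, APs, APm, APl) into one space-separated string. A small extraction sketch, assuming only that summary-line format (the helper name and regular expression are illustrative, not an MMDetection API):

```python
import re

# Illustrative helper (an assumption, not an MMDetection API): recover the six COCO
# numbers (AP, AP_50, AP_75, AP_s, AP_m, AP_l) from the "copypaste" fields of an
# Epoch(val) summary line such as the one logged after epoch 4 above.
SUMMARY = ("Epoch(val) [4][625] bbox_mAP: 0.4360, bbox_mAP_50: 0.6750, bbox_mAP_75: 0.4800, "
           "bbox_mAP_s: 0.2620, bbox_mAP_m: 0.4850, bbox_mAP_l: 0.6010, "
           "bbox_mAP_copypaste: 0.436 0.675 0.480 0.262 0.485 0.601, "
           "segm_mAP: 0.3980, segm_mAP_50: 0.6420, segm_mAP_75: 0.4240, "
           "segm_mAP_s: 0.1810, segm_mAP_m: 0.4360, segm_mAP_l: 0.6190, "
           "segm_mAP_copypaste: 0.398 0.642 0.424 0.181 0.436 0.619")

NAMES = ("AP", "AP_50", "AP_75", "AP_s", "AP_m", "AP_l")

def copypaste_metrics(summary: str, task: str) -> dict:
    """Return the six COCO metrics for `task` ('bbox' or 'segm')."""
    values = re.search(rf"{task}_mAP_copypaste: ([\d. ]+?)(?:,|$)", summary).group(1).split()
    return dict(zip(NAMES, map(float, values)))

print(copypaste_metrics(SUMMARY, "bbox"))  # {'AP': 0.436, 'AP_50': 0.675, ...}
print(copypaste_metrics(SUMMARY, "segm"))  # {'AP': 0.398, 'AP_50': 0.642, ...}
```

Collecting these dictionaries for each epoch's summary line gives a compact mAP-versus-epoch curve without re-reading the full AP/AR tables.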
2024-05-31 05:52:00,127 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 12:24:08, time: 0.821, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0479, loss_cls: 0.2049, acc: 92.5208, loss_bbox: 0.2528, loss_mask: 0.2492, loss: 0.7766 2024-05-31 05:52:42,808 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 12:23:29, time: 0.854, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0463, loss_cls: 0.1928, acc: 93.0840, loss_bbox: 0.2378, loss_mask: 0.2370, loss: 0.7343 2024-05-31 05:53:23,179 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 12:22:46, time: 0.807, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0473, loss_cls: 0.1933, acc: 92.8911, loss_bbox: 0.2458, loss_mask: 0.2448, loss: 0.7526 2024-05-31 05:54:04,475 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 12:22:05, time: 0.826, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0505, loss_cls: 0.2022, acc: 92.6677, loss_bbox: 0.2530, loss_mask: 0.2521, loss: 0.7785 2024-05-31 05:54:44,509 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 12:21:22, time: 0.801, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0427, loss_cls: 0.1844, acc: 93.3086, loss_bbox: 0.2305, loss_mask: 0.2407, loss: 0.7180 2024-05-31 05:55:24,934 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 12:20:40, time: 0.808, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0484, loss_cls: 0.1917, acc: 92.9561, loss_bbox: 0.2437, loss_mask: 0.2419, loss: 0.7453 2024-05-31 05:56:06,019 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 12:19:58, time: 0.822, data_time: 0.078, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0467, loss_cls: 0.1993, acc: 92.7961, loss_bbox: 0.2491, loss_mask: 0.2460, loss: 0.7608 2024-05-31 05:56:47,454 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 12:19:18, time: 0.829, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0472, loss_cls: 0.1950, acc: 92.9780, loss_bbox: 0.2496, loss_mask: 0.2425, loss: 0.7562 2024-05-31 05:57:28,413 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 12:18:36, time: 0.819, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0476, loss_cls: 0.1952, acc: 92.8435, loss_bbox: 0.2524, loss_mask: 0.2449, loss: 0.7599 2024-05-31 05:58:11,662 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 12:17:58, time: 0.865, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0456, loss_cls: 0.1948, acc: 92.9194, loss_bbox: 0.2442, loss_mask: 0.2460, loss: 0.7509 2024-05-31 05:58:52,415 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 12:17:16, time: 0.815, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0484, loss_cls: 0.1963, acc: 92.9092, loss_bbox: 0.2448, loss_mask: 0.2402, loss: 0.7491 2024-05-31 05:59:33,608 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 12:16:35, time: 0.824, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0457, loss_cls: 0.1893, acc: 93.1331, loss_bbox: 0.2338, loss_mask: 0.2374, loss: 0.7285 2024-05-31 06:00:14,143 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 12:15:53, time: 0.811, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0464, loss_cls: 0.1948, acc: 92.8613, loss_bbox: 0.2445, loss_mask: 0.2403, loss: 0.7455 2024-05-31 06:00:54,691 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 12:15:10, 
time: 0.811, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0463, loss_cls: 0.1975, acc: 92.7756, loss_bbox: 0.2484, loss_mask: 0.2408, loss: 0.7545 2024-05-31 06:01:38,205 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 12:14:33, time: 0.870, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0463, loss_cls: 0.1925, acc: 93.0488, loss_bbox: 0.2438, loss_mask: 0.2440, loss: 0.7459 2024-05-31 06:02:19,199 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 12:13:51, time: 0.819, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0460, loss_cls: 0.1963, acc: 92.8147, loss_bbox: 0.2465, loss_mask: 0.2479, loss: 0.7581 2024-05-31 06:03:00,420 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 12:13:10, time: 0.825, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0493, loss_cls: 0.1981, acc: 92.9370, loss_bbox: 0.2426, loss_mask: 0.2498, loss: 0.7610 2024-05-31 06:03:45,402 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 12:12:35, time: 0.900, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0453, loss_cls: 0.1998, acc: 92.7957, loss_bbox: 0.2497, loss_mask: 0.2472, loss: 0.7627 2024-05-31 06:04:28,870 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 12:11:57, time: 0.869, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0451, loss_cls: 0.1939, acc: 92.9358, loss_bbox: 0.2406, loss_mask: 0.2409, loss: 0.7408 2024-05-31 06:05:09,306 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 12:11:15, time: 0.808, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0448, loss_cls: 0.1882, acc: 93.1772, loss_bbox: 0.2389, loss_mask: 0.2504, loss: 0.7406 2024-05-31 06:05:54,999 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 12:10:40, time: 0.914, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0483, loss_cls: 0.2013, acc: 92.6287, loss_bbox: 0.2561, loss_mask: 0.2507, loss: 0.7781 2024-05-31 06:06:36,041 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 12:09:59, time: 0.821, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1842, acc: 93.2996, loss_bbox: 0.2351, loss_mask: 0.2378, loss: 0.7186 2024-05-31 06:07:20,102 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 12:09:18, time: 0.824, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0482, loss_cls: 0.1979, acc: 92.7520, loss_bbox: 0.2477, loss_mask: 0.2467, loss: 0.7617 2024-05-31 06:08:00,680 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 12:08:40, time: 0.869, data_time: 0.109, memory: 18874, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0475, loss_cls: 0.1846, acc: 93.2620, loss_bbox: 0.2388, loss_mask: 0.2472, loss: 0.7397 2024-05-31 06:08:41,127 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 12:07:57, time: 0.809, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0454, loss_cls: 0.1936, acc: 92.9351, loss_bbox: 0.2418, loss_mask: 0.2463, loss: 0.7475 2024-05-31 06:09:21,839 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 12:07:15, time: 0.814, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0443, loss_cls: 0.1910, acc: 93.1819, loss_bbox: 0.2341, loss_mask: 0.2367, loss: 0.7251 2024-05-31 06:10:02,297 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 12:06:33, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0482, 
loss_cls: 0.1970, acc: 92.8650, loss_bbox: 0.2513, loss_mask: 0.2495, loss: 0.7651 2024-05-31 06:10:43,104 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 12:05:51, time: 0.816, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0458, loss_cls: 0.1969, acc: 92.7852, loss_bbox: 0.2491, loss_mask: 0.2544, loss: 0.7654 2024-05-31 06:11:24,236 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 12:05:10, time: 0.823, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0484, loss_cls: 0.2048, acc: 92.6511, loss_bbox: 0.2502, loss_mask: 0.2460, loss: 0.7708 2024-05-31 06:12:05,355 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 12:04:29, time: 0.822, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0440, loss_cls: 0.2014, acc: 92.5447, loss_bbox: 0.2576, loss_mask: 0.2421, loss: 0.7654 2024-05-31 06:12:49,218 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 12:03:51, time: 0.877, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0462, loss_cls: 0.1913, acc: 93.0063, loss_bbox: 0.2364, loss_mask: 0.2302, loss: 0.7240 2024-05-31 06:13:29,880 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 12:03:09, time: 0.813, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0430, loss_cls: 0.1828, acc: 93.3235, loss_bbox: 0.2304, loss_mask: 0.2399, loss: 0.7144 2024-05-31 06:14:11,111 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 12:02:28, time: 0.825, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0464, loss_cls: 0.1958, acc: 92.6455, loss_bbox: 0.2499, loss_mask: 0.2436, loss: 0.7548 2024-05-31 06:14:51,872 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 12:01:46, time: 0.815, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0456, loss_cls: 0.1913, acc: 93.0186, loss_bbox: 0.2451, loss_mask: 0.2468, loss: 0.7487 2024-05-31 06:15:32,479 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 12:01:04, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0449, loss_cls: 0.1907, acc: 93.0449, loss_bbox: 0.2385, loss_mask: 0.2414, loss: 0.7350 2024-05-31 06:16:13,744 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 12:00:23, time: 0.825, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0457, loss_cls: 0.1957, acc: 92.9255, loss_bbox: 0.2408, loss_mask: 0.2409, loss: 0.7436 2024-05-31 06:16:56,466 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 11:59:44, time: 0.855, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0423, loss_cls: 0.1845, acc: 93.3652, loss_bbox: 0.2330, loss_mask: 0.2337, loss: 0.7131 2024-05-31 06:17:36,901 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 11:59:02, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0466, loss_cls: 0.1908, acc: 93.0654, loss_bbox: 0.2415, loss_mask: 0.2384, loss: 0.7371 2024-05-31 06:18:19,864 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 11:58:23, time: 0.859, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0473, loss_cls: 0.1980, acc: 92.7393, loss_bbox: 0.2468, loss_mask: 0.2417, loss: 0.7544 2024-05-31 06:19:04,699 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 11:57:47, time: 0.896, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0462, loss_cls: 0.1916, acc: 92.9587, loss_bbox: 0.2419, loss_mask: 0.2412, loss: 0.7393 2024-05-31 
06:19:45,624 - mmdet - INFO - Epoch [5][6450/7330] lr: 1.000e-04, eta: 11:57:05, time: 0.819, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0434, loss_cls: 0.1926, acc: 93.1492, loss_bbox: 0.2366, loss_mask: 0.2400, loss: 0.7325 2024-05-31 06:20:29,936 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 11:56:29, time: 0.886, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0435, loss_cls: 0.1907, acc: 93.2627, loss_bbox: 0.2335, loss_mask: 0.2338, loss: 0.7211 2024-05-31 06:21:10,868 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 11:55:47, time: 0.819, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0472, loss_cls: 0.1954, acc: 92.9116, loss_bbox: 0.2454, loss_mask: 0.2444, loss: 0.7555 2024-05-31 06:21:50,976 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 11:55:04, time: 0.802, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0448, loss_cls: 0.1870, acc: 93.1929, loss_bbox: 0.2371, loss_mask: 0.2426, loss: 0.7314 2024-05-31 06:22:33,668 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 11:54:25, time: 0.854, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0459, loss_cls: 0.1972, acc: 92.9395, loss_bbox: 0.2413, loss_mask: 0.2400, loss: 0.7458 2024-05-31 06:23:14,138 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 11:53:43, time: 0.809, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0477, loss_cls: 0.1947, acc: 92.9497, loss_bbox: 0.2445, loss_mask: 0.2455, loss: 0.7540 2024-05-31 06:23:54,575 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 11:53:00, time: 0.809, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0438, loss_cls: 0.1874, acc: 93.2209, loss_bbox: 0.2404, loss_mask: 0.2365, loss: 0.7269 2024-05-31 06:24:35,122 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 11:52:18, time: 0.811, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0460, loss_cls: 0.1977, acc: 92.9060, loss_bbox: 0.2412, loss_mask: 0.2401, loss: 0.7435 2024-05-31 06:25:15,937 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 11:51:36, time: 0.816, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0457, loss_cls: 0.1887, acc: 93.0762, loss_bbox: 0.2338, loss_mask: 0.2384, loss: 0.7260 2024-05-31 06:25:57,585 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 11:50:56, time: 0.833, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0475, loss_cls: 0.1939, acc: 92.8660, loss_bbox: 0.2469, loss_mask: 0.2442, loss: 0.7538 2024-05-31 06:26:37,939 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 11:50:13, time: 0.807, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0477, loss_cls: 0.1986, acc: 92.8354, loss_bbox: 0.2487, loss_mask: 0.2488, loss: 0.7660 2024-05-31 06:27:21,028 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 11:49:35, time: 0.862, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0420, loss_cls: 0.1833, acc: 93.4016, loss_bbox: 0.2239, loss_mask: 0.2374, loss: 0.7070 2024-05-31 06:28:01,885 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 11:48:53, time: 0.817, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0459, loss_cls: 0.1939, acc: 92.8376, loss_bbox: 0.2460, loss_mask: 0.2374, loss: 0.7429 2024-05-31 06:28:42,672 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 11:48:11, time: 0.816, 
data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0463, loss_cls: 0.1888, acc: 93.1162, loss_bbox: 0.2379, loss_mask: 0.2416, loss: 0.7348
2024-05-31 06:29:23,409 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 11:47:29, time: 0.815, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0462, loss_cls: 0.1865, acc: 93.2427, loss_bbox: 0.2388, loss_mask: 0.2425, loss: 0.7362
2024-05-31 06:30:04,156 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 11:46:47, time: 0.815, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0482, loss_cls: 0.1959, acc: 92.7434, loss_bbox: 0.2525, loss_mask: 0.2495, loss: 0.7647
2024-05-31 06:30:45,229 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 11:46:06, time: 0.821, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0470, loss_cls: 0.1953, acc: 92.8501, loss_bbox: 0.2460, loss_mask: 0.2432, loss: 0.7513
2024-05-31 06:31:27,492 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 11:45:26, time: 0.845, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0473, loss_cls: 0.1935, acc: 92.9338, loss_bbox: 0.2421, loss_mask: 0.2365, loss: 0.7390
2024-05-31 06:31:52,739 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-05-31 06:33:43,245 - mmdet - INFO - Evaluating bbox...
2024-05-31 06:34:07,623 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.447
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.689
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.489
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.272
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.491
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.609
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.571
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.571
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.571
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.375
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.621
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.737
2024-05-31 06:34:07,623 - mmdet - INFO - Evaluating segm...
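Note on the iteration records above: each one reports the five Mask R-CNN loss terms (loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox, loss_mask) together with their sum in the final loss field, all averaged over the 50-iteration logging interval. Below is a minimal sketch for sanity-checking that relationship; it assumes only Python 3 and the standard library, and the regex is written against the line format shown here rather than taken from any mmdet utility.

```python
import re

# One iteration record copied verbatim from the log above (Epoch [5][7150/7330]).
line = ("Epoch [5][7150/7330] lr: 1.000e-04, eta: 11:47:29, time: 0.815, "
        "data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0223, "
        "loss_rpn_bbox: 0.0462, loss_cls: 0.1865, acc: 93.2427, "
        "loss_bbox: 0.2388, loss_mask: 0.2425, loss: 0.7362")

# Grab every "key: value" pair; values stay strings and are cast where needed.
fields = dict(re.findall(r"(\w+): ([\d.]+(?:e-\d+)?)", line))

parts = ["loss_rpn_cls", "loss_rpn_bbox", "loss_cls", "loss_bbox", "loss_mask"]
total = sum(float(fields[k]) for k in parts)

# Each term is a 50-iteration average printed to four decimals, so the
# recomputed sum may differ from the reported total by a small rounding residue.
assert abs(total - float(fields["loss"])) < 1e-3, (total, fields["loss"])
print(f"sum of parts = {total:.4f}, reported loss = {fields['loss']}")
```

Because every component is rounded to four decimals before printing, the recomputed sum can drift from the reported loss by about 1e-4 (as it does for the record used here), hence the tolerance in the assert.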
2024-05-31 06:34:32,457 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.408
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.650
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.438
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.196
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.443
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.626
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.314
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.573
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.714
2024-05-31 06:34:32,855 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 06:34:32,857 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4470, bbox_mAP_50: 0.6890, bbox_mAP_75: 0.4890, bbox_mAP_s: 0.2720, bbox_mAP_m: 0.4910, bbox_mAP_l: 0.6090, bbox_mAP_copypaste: 0.447 0.689 0.489 0.272 0.491 0.609, segm_mAP: 0.4080, segm_mAP_50: 0.6500, segm_mAP_75: 0.4380, segm_mAP_s: 0.1960, segm_mAP_m: 0.4430, segm_mAP_l: 0.6260, segm_mAP_copypaste: 0.408 0.650 0.438 0.196 0.443 0.626
2024-05-31 06:35:30,258 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 11:44:08, time: 1.148, data_time: 0.126, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0447, loss_cls: 0.1823, acc: 93.3557, loss_bbox: 0.2327, loss_mask: 0.2341, loss: 0.7115
2024-05-31 06:36:10,773 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 11:43:26, time: 0.810, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0428, loss_cls: 0.1775, acc: 93.4358, loss_bbox: 0.2271, loss_mask: 0.2340, loss: 0.6986
2024-05-31 06:36:51,960 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 11:42:45, time: 0.824, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0443, loss_cls: 0.1888, acc: 93.1055, loss_bbox: 0.2395, loss_mask: 0.2373, loss: 0.7290
2024-05-31 06:37:33,531 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 11:42:04, time: 0.831, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0449, loss_cls: 0.1805, acc: 93.2856, loss_bbox: 0.2403, loss_mask: 0.2365, loss: 0.7202
2024-05-31 06:38:14,036 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 11:41:22, time: 0.810, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0412, loss_cls: 0.1780, acc: 93.5476, loss_bbox: 0.2219, loss_mask: 0.2287, loss: 0.6866
2024-05-31 06:38:55,255 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 11:40:41, time: 0.824, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0450, loss_cls: 0.1899, acc: 93.0015, loss_bbox: 0.2441, loss_mask: 0.2443, loss: 0.7425
2024-05-31 06:39:38,439 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 06:39:38,439 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 11:40:02, time: 0.864, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0444, loss_cls: 0.1890, acc: 92.9993, loss_bbox: 0.2473, loss_mask: 0.2369, loss: 0.7360
2024-05-31 06:40:19,083 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 11:39:20, time: 0.813, data_time: 0.066, memory:
18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0438, loss_cls: 0.1865, acc: 93.1533, loss_bbox: 0.2397, loss_mask: 0.2321, loss: 0.7193 2024-05-31 06:41:00,296 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 11:38:39, time: 0.824, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0442, loss_cls: 0.1808, acc: 93.4001, loss_bbox: 0.2297, loss_mask: 0.2373, loss: 0.7106 2024-05-31 06:41:42,027 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 11:37:59, time: 0.834, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0454, loss_cls: 0.1851, acc: 93.3313, loss_bbox: 0.2311, loss_mask: 0.2377, loss: 0.7176 2024-05-31 06:42:22,536 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 11:37:16, time: 0.811, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0458, loss_cls: 0.1854, acc: 93.2429, loss_bbox: 0.2367, loss_mask: 0.2402, loss: 0.7269 2024-05-31 06:43:03,519 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 11:36:35, time: 0.820, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0480, loss_cls: 0.1890, acc: 92.9978, loss_bbox: 0.2412, loss_mask: 0.2395, loss: 0.7374 2024-05-31 06:43:44,463 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 11:35:53, time: 0.819, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0453, loss_cls: 0.1942, acc: 92.9487, loss_bbox: 0.2428, loss_mask: 0.2429, loss: 0.7434 2024-05-31 06:44:25,989 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 11:35:13, time: 0.830, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0432, loss_cls: 0.1920, acc: 93.1309, loss_bbox: 0.2370, loss_mask: 0.2305, loss: 0.7208 2024-05-31 06:45:06,286 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 11:34:30, time: 0.806, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0392, loss_cls: 0.1781, acc: 93.5276, loss_bbox: 0.2272, loss_mask: 0.2313, loss: 0.6929 2024-05-31 06:45:47,148 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 11:33:48, time: 0.817, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0446, loss_cls: 0.1852, acc: 93.2046, loss_bbox: 0.2317, loss_mask: 0.2351, loss: 0.7142 2024-05-31 06:46:28,189 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 11:33:07, time: 0.821, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0434, loss_cls: 0.1850, acc: 93.1631, loss_bbox: 0.2359, loss_mask: 0.2437, loss: 0.7260 2024-05-31 06:47:11,336 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 11:32:28, time: 0.863, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0431, loss_cls: 0.1850, acc: 93.1086, loss_bbox: 0.2324, loss_mask: 0.2369, loss: 0.7151 2024-05-31 06:47:51,957 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 11:31:46, time: 0.812, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0445, loss_cls: 0.1818, acc: 93.3696, loss_bbox: 0.2348, loss_mask: 0.2353, loss: 0.7166 2024-05-31 06:48:32,933 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 11:31:05, time: 0.819, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1827, acc: 93.2515, loss_bbox: 0.2288, loss_mask: 0.2398, loss: 0.7130 2024-05-31 06:49:13,823 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 11:30:23, time: 0.818, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0452, loss_cls: 0.1808, acc: 93.2305, loss_bbox: 0.2403, 
loss_mask: 0.2433, loss: 0.7273 2024-05-31 06:49:57,980 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 11:29:46, time: 0.883, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0458, loss_cls: 0.1923, acc: 92.8647, loss_bbox: 0.2476, loss_mask: 0.2369, loss: 0.7421 2024-05-31 06:50:43,730 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 11:29:11, time: 0.915, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0466, loss_cls: 0.1866, acc: 93.0825, loss_bbox: 0.2425, loss_mask: 0.2356, loss: 0.7309 2024-05-31 06:51:24,712 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 11:28:29, time: 0.820, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0443, loss_cls: 0.1938, acc: 92.8440, loss_bbox: 0.2467, loss_mask: 0.2363, loss: 0.7403 2024-05-31 06:52:10,856 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 11:27:55, time: 0.923, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0429, loss_cls: 0.1803, acc: 93.2908, loss_bbox: 0.2315, loss_mask: 0.2382, loss: 0.7100 2024-05-31 06:52:52,798 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 11:27:14, time: 0.839, data_time: 0.078, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0478, loss_cls: 0.1953, acc: 92.7686, loss_bbox: 0.2517, loss_mask: 0.2430, loss: 0.7581 2024-05-31 06:53:33,382 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 06:53:33,382 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 11:26:32, time: 0.812, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0451, loss_cls: 0.1938, acc: 93.0100, loss_bbox: 0.2387, loss_mask: 0.2388, loss: 0.7355 2024-05-31 06:54:16,253 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 11:25:53, time: 0.857, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0453, loss_cls: 0.1891, acc: 93.1162, loss_bbox: 0.2399, loss_mask: 0.2435, loss: 0.7383 2024-05-31 06:54:59,440 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 11:25:14, time: 0.864, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0440, loss_cls: 0.1829, acc: 93.2419, loss_bbox: 0.2323, loss_mask: 0.2351, loss: 0.7114 2024-05-31 06:55:40,389 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 11:24:33, time: 0.819, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0445, loss_cls: 0.1853, acc: 93.2192, loss_bbox: 0.2375, loss_mask: 0.2376, loss: 0.7235 2024-05-31 06:56:21,225 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 11:23:51, time: 0.817, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0461, loss_cls: 0.1876, acc: 93.0146, loss_bbox: 0.2421, loss_mask: 0.2372, loss: 0.7306 2024-05-31 06:57:02,366 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 11:23:10, time: 0.823, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0471, loss_cls: 0.1900, acc: 93.0781, loss_bbox: 0.2420, loss_mask: 0.2396, loss: 0.7369 2024-05-31 06:57:42,817 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 11:22:28, time: 0.809, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0421, loss_cls: 0.1779, acc: 93.4146, loss_bbox: 0.2295, loss_mask: 0.2314, loss: 0.6979 2024-05-31 06:58:23,549 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 11:21:46, time: 0.815, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0474, loss_cls: 0.1947, acc: 92.7339, loss_bbox: 
0.2475, loss_mask: 0.2420, loss: 0.7517 2024-05-31 06:59:04,369 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 11:21:04, time: 0.816, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0468, loss_cls: 0.1867, acc: 93.1482, loss_bbox: 0.2392, loss_mask: 0.2419, loss: 0.7334 2024-05-31 06:59:45,163 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 11:20:22, time: 0.816, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0484, loss_cls: 0.1928, acc: 92.8044, loss_bbox: 0.2499, loss_mask: 0.2425, loss: 0.7528 2024-05-31 07:00:25,269 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 11:19:39, time: 0.802, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0452, loss_cls: 0.1834, acc: 93.2146, loss_bbox: 0.2372, loss_mask: 0.2331, loss: 0.7160 2024-05-31 07:01:06,222 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 11:18:58, time: 0.819, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0444, loss_cls: 0.1871, acc: 93.0798, loss_bbox: 0.2342, loss_mask: 0.2324, loss: 0.7160 2024-05-31 07:01:51,061 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 11:18:21, time: 0.897, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0458, loss_cls: 0.1935, acc: 92.9360, loss_bbox: 0.2436, loss_mask: 0.2402, loss: 0.7408 2024-05-31 07:02:31,889 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 11:17:40, time: 0.817, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0455, loss_cls: 0.1829, acc: 93.1829, loss_bbox: 0.2354, loss_mask: 0.2375, loss: 0.7189 2024-05-31 07:03:12,712 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 11:16:58, time: 0.816, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0450, loss_cls: 0.1905, acc: 93.0088, loss_bbox: 0.2447, loss_mask: 0.2435, loss: 0.7418 2024-05-31 07:03:53,258 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 11:16:16, time: 0.811, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0443, loss_cls: 0.1888, acc: 93.0701, loss_bbox: 0.2424, loss_mask: 0.2423, loss: 0.7362 2024-05-31 07:04:36,726 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 11:15:37, time: 0.869, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0472, loss_cls: 0.1891, acc: 93.1372, loss_bbox: 0.2412, loss_mask: 0.2402, loss: 0.7367 2024-05-31 07:05:22,430 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 11:15:02, time: 0.914, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0480, loss_cls: 0.1911, acc: 93.0039, loss_bbox: 0.2456, loss_mask: 0.2408, loss: 0.7439 2024-05-31 07:06:02,886 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 11:14:19, time: 0.809, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0436, loss_cls: 0.1750, acc: 93.4470, loss_bbox: 0.2271, loss_mask: 0.2368, loss: 0.7009 2024-05-31 07:06:49,027 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 11:13:44, time: 0.923, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0429, loss_cls: 0.1814, acc: 93.3667, loss_bbox: 0.2277, loss_mask: 0.2295, loss: 0.6996 2024-05-31 07:07:30,046 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 07:07:30,046 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 11:13:03, time: 0.820, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0497, loss_cls: 0.1910, acc: 92.9441, 
loss_bbox: 0.2435, loss_mask: 0.2439, loss: 0.7475 2024-05-31 07:08:10,641 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 11:12:21, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0431, loss_cls: 0.1811, acc: 93.3542, loss_bbox: 0.2296, loss_mask: 0.2323, loss: 0.7040 2024-05-31 07:08:53,072 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 11:11:41, time: 0.849, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0422, loss_cls: 0.1771, acc: 93.4885, loss_bbox: 0.2265, loss_mask: 0.2298, loss: 0.6927 2024-05-31 07:09:33,903 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 11:10:59, time: 0.817, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0441, loss_cls: 0.1848, acc: 93.2686, loss_bbox: 0.2360, loss_mask: 0.2394, loss: 0.7244 2024-05-31 07:10:18,891 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 11:10:23, time: 0.900, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0454, loss_cls: 0.1796, acc: 93.3850, loss_bbox: 0.2341, loss_mask: 0.2359, loss: 0.7126 2024-05-31 07:11:00,511 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 11:09:42, time: 0.832, data_time: 0.077, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0442, loss_cls: 0.1837, acc: 93.2224, loss_bbox: 0.2393, loss_mask: 0.2387, loss: 0.7252 2024-05-31 07:11:41,453 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 11:09:00, time: 0.819, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0455, loss_cls: 0.1801, acc: 93.3538, loss_bbox: 0.2333, loss_mask: 0.2351, loss: 0.7116 2024-05-31 07:12:22,051 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 11:08:18, time: 0.812, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0458, loss_cls: 0.1902, acc: 93.0034, loss_bbox: 0.2440, loss_mask: 0.2405, loss: 0.7405 2024-05-31 07:13:02,600 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 11:07:36, time: 0.811, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0437, loss_cls: 0.1855, acc: 93.2507, loss_bbox: 0.2309, loss_mask: 0.2375, loss: 0.7156 2024-05-31 07:13:43,633 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 11:06:55, time: 0.821, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0445, loss_cls: 0.1870, acc: 93.1687, loss_bbox: 0.2399, loss_mask: 0.2411, loss: 0.7308 2024-05-31 07:14:24,624 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 11:06:13, time: 0.820, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0423, loss_cls: 0.1797, acc: 93.3875, loss_bbox: 0.2355, loss_mask: 0.2381, loss: 0.7132 2024-05-31 07:15:04,816 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 11:05:31, time: 0.804, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0468, loss_cls: 0.1902, acc: 92.9719, loss_bbox: 0.2446, loss_mask: 0.2429, loss: 0.7422 2024-05-31 07:15:45,682 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 11:04:49, time: 0.817, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0465, loss_cls: 0.1930, acc: 92.8921, loss_bbox: 0.2440, loss_mask: 0.2428, loss: 0.7457 2024-05-31 07:16:29,125 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 11:04:10, time: 0.869, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0444, loss_cls: 0.1921, acc: 92.9497, loss_bbox: 0.2434, loss_mask: 0.2356, loss: 0.7349 2024-05-31 07:17:09,878 - mmdet - INFO - 
Epoch [6][3050/7330] lr: 1.000e-04, eta: 11:03:29, time: 0.815, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0449, loss_cls: 0.1866, acc: 93.0696, loss_bbox: 0.2418, loss_mask: 0.2406, loss: 0.7319 2024-05-31 07:17:50,589 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 11:02:47, time: 0.814, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0433, loss_cls: 0.1872, acc: 93.1531, loss_bbox: 0.2357, loss_mask: 0.2387, loss: 0.7227 2024-05-31 07:18:31,304 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 11:02:05, time: 0.814, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0438, loss_cls: 0.1876, acc: 93.0032, loss_bbox: 0.2382, loss_mask: 0.2391, loss: 0.7256 2024-05-31 07:19:15,335 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 11:01:27, time: 0.881, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0425, loss_cls: 0.1850, acc: 93.3433, loss_bbox: 0.2325, loss_mask: 0.2370, loss: 0.7153 2024-05-31 07:20:00,208 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 11:00:50, time: 0.897, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0426, loss_cls: 0.1810, acc: 93.3667, loss_bbox: 0.2315, loss_mask: 0.2327, loss: 0.7062 2024-05-31 07:20:40,535 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 11:00:08, time: 0.806, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0434, loss_cls: 0.1802, acc: 93.3640, loss_bbox: 0.2297, loss_mask: 0.2402, loss: 0.7130 2024-05-31 07:21:26,241 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 07:21:26,241 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 10:59:32, time: 0.914, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0459, loss_cls: 0.1857, acc: 93.0898, loss_bbox: 0.2377, loss_mask: 0.2422, loss: 0.7311 2024-05-31 07:22:06,398 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 10:58:49, time: 0.803, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0422, loss_cls: 0.1791, acc: 93.3120, loss_bbox: 0.2306, loss_mask: 0.2343, loss: 0.7022 2024-05-31 07:22:47,246 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 10:58:07, time: 0.816, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0434, loss_cls: 0.1843, acc: 93.1853, loss_bbox: 0.2370, loss_mask: 0.2387, loss: 0.7224 2024-05-31 07:23:30,021 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 10:57:28, time: 0.856, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0425, loss_cls: 0.1699, acc: 93.7769, loss_bbox: 0.2174, loss_mask: 0.2293, loss: 0.6763 2024-05-31 07:24:10,052 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 10:56:45, time: 0.801, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0450, loss_cls: 0.1886, acc: 93.0212, loss_bbox: 0.2405, loss_mask: 0.2353, loss: 0.7280 2024-05-31 07:24:52,770 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 10:56:06, time: 0.854, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0450, loss_cls: 0.1881, acc: 93.0701, loss_bbox: 0.2403, loss_mask: 0.2364, loss: 0.7268 2024-05-31 07:25:33,355 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 10:55:24, time: 0.812, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0439, loss_cls: 0.1827, acc: 93.2971, loss_bbox: 0.2301, loss_mask: 0.2391, loss: 0.7137 2024-05-31 07:26:14,405 - mmdet - 
INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 10:54:42, time: 0.821, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0453, loss_cls: 0.1841, acc: 93.2195, loss_bbox: 0.2405, loss_mask: 0.2378, loss: 0.7274 2024-05-31 07:26:54,476 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 10:54:00, time: 0.801, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0418, loss_cls: 0.1786, acc: 93.4968, loss_bbox: 0.2290, loss_mask: 0.2319, loss: 0.6977 2024-05-31 07:27:36,099 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 10:53:19, time: 0.832, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0443, loss_cls: 0.1885, acc: 93.1887, loss_bbox: 0.2390, loss_mask: 0.2369, loss: 0.7277 2024-05-31 07:28:17,121 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 10:52:37, time: 0.820, data_time: 0.037, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0445, loss_cls: 0.1924, acc: 93.0154, loss_bbox: 0.2412, loss_mask: 0.2404, loss: 0.7373 2024-05-31 07:28:57,952 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 10:51:55, time: 0.817, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0452, loss_cls: 0.1839, acc: 93.3169, loss_bbox: 0.2337, loss_mask: 0.2372, loss: 0.7182 2024-05-31 07:29:38,075 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 10:51:13, time: 0.802, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0437, loss_cls: 0.1866, acc: 93.1133, loss_bbox: 0.2391, loss_mask: 0.2442, loss: 0.7328 2024-05-31 07:30:18,840 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 10:50:31, time: 0.815, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0430, loss_cls: 0.1852, acc: 93.2395, loss_bbox: 0.2299, loss_mask: 0.2372, loss: 0.7138 2024-05-31 07:31:02,702 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 10:49:53, time: 0.877, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0450, loss_cls: 0.1950, acc: 92.8315, loss_bbox: 0.2455, loss_mask: 0.2420, loss: 0.7480 2024-05-31 07:31:44,074 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 10:49:12, time: 0.827, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0519, loss_cls: 0.1934, acc: 92.8408, loss_bbox: 0.2479, loss_mask: 0.2452, loss: 0.7603 2024-05-31 07:32:24,382 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 10:48:29, time: 0.806, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0449, loss_cls: 0.1872, acc: 93.2542, loss_bbox: 0.2403, loss_mask: 0.2447, loss: 0.7356 2024-05-31 07:33:04,757 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 10:47:47, time: 0.807, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0422, loss_cls: 0.1808, acc: 93.2991, loss_bbox: 0.2289, loss_mask: 0.2343, loss: 0.7047 2024-05-31 07:33:45,826 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 10:47:06, time: 0.821, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0457, loss_cls: 0.1880, acc: 93.1968, loss_bbox: 0.2377, loss_mask: 0.2377, loss: 0.7281 2024-05-31 07:34:33,930 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 10:46:32, time: 0.962, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0452, loss_cls: 0.1895, acc: 93.0288, loss_bbox: 0.2383, loss_mask: 0.2430, loss: 0.7368 2024-05-31 07:35:14,747 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 07:35:14,748 - 
mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 10:45:50, time: 0.816, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0456, loss_cls: 0.1803, acc: 93.2844, loss_bbox: 0.2389, loss_mask: 0.2401, loss: 0.7239 2024-05-31 07:36:01,361 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 10:45:15, time: 0.932, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0445, loss_cls: 0.1839, acc: 93.2434, loss_bbox: 0.2327, loss_mask: 0.2404, loss: 0.7189 2024-05-31 07:36:41,595 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 10:44:33, time: 0.805, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0442, loss_cls: 0.1816, acc: 93.3596, loss_bbox: 0.2315, loss_mask: 0.2369, loss: 0.7127 2024-05-31 07:37:22,242 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 10:43:51, time: 0.813, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0473, loss_cls: 0.1916, acc: 92.9585, loss_bbox: 0.2438, loss_mask: 0.2427, loss: 0.7450 2024-05-31 07:38:05,470 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 10:43:12, time: 0.865, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0456, loss_cls: 0.1832, acc: 93.2959, loss_bbox: 0.2349, loss_mask: 0.2377, loss: 0.7183 2024-05-31 07:38:46,336 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 10:42:30, time: 0.817, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0447, loss_cls: 0.1912, acc: 93.0249, loss_bbox: 0.2445, loss_mask: 0.2372, loss: 0.7366 2024-05-31 07:39:29,100 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 10:41:51, time: 0.855, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0434, loss_cls: 0.1816, acc: 93.2332, loss_bbox: 0.2349, loss_mask: 0.2328, loss: 0.7108 2024-05-31 07:40:09,379 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 10:41:08, time: 0.806, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0450, loss_cls: 0.1868, acc: 93.1636, loss_bbox: 0.2382, loss_mask: 0.2332, loss: 0.7226 2024-05-31 07:40:49,492 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 10:40:26, time: 0.802, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0423, loss_cls: 0.1758, acc: 93.6213, loss_bbox: 0.2193, loss_mask: 0.2264, loss: 0.6802 2024-05-31 07:41:29,859 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 10:39:43, time: 0.807, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0422, loss_cls: 0.1876, acc: 93.0459, loss_bbox: 0.2393, loss_mask: 0.2370, loss: 0.7235 2024-05-31 07:42:10,756 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 10:39:02, time: 0.818, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0461, loss_cls: 0.1852, acc: 93.2283, loss_bbox: 0.2406, loss_mask: 0.2385, loss: 0.7290 2024-05-31 07:42:51,418 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 10:38:20, time: 0.813, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0440, loss_cls: 0.1912, acc: 92.9102, loss_bbox: 0.2398, loss_mask: 0.2365, loss: 0.7293 2024-05-31 07:43:31,784 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 10:37:38, time: 0.807, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0459, loss_cls: 0.1952, acc: 92.8020, loss_bbox: 0.2512, loss_mask: 0.2470, loss: 0.7586 2024-05-31 07:44:11,805 - mmdet - INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 10:36:55, time: 0.800, data_time: 0.062, 
memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0416, loss_cls: 0.1732, acc: 93.6082, loss_bbox: 0.2243, loss_mask: 0.2316, loss: 0.6869 2024-05-31 07:44:52,689 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 10:36:13, time: 0.818, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0489, loss_cls: 0.1918, acc: 92.8416, loss_bbox: 0.2492, loss_mask: 0.2447, loss: 0.7545 2024-05-31 07:45:35,106 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 10:35:33, time: 0.848, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0434, loss_cls: 0.1813, acc: 93.2534, loss_bbox: 0.2307, loss_mask: 0.2346, loss: 0.7063 2024-05-31 07:46:15,939 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 10:34:51, time: 0.817, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0432, loss_cls: 0.1874, acc: 93.0923, loss_bbox: 0.2366, loss_mask: 0.2397, loss: 0.7245 2024-05-31 07:46:56,003 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 10:34:09, time: 0.801, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0426, loss_cls: 0.1757, acc: 93.6685, loss_bbox: 0.2272, loss_mask: 0.2298, loss: 0.6935 2024-05-31 07:47:36,417 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 10:33:27, time: 0.808, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0432, loss_cls: 0.1846, acc: 93.3240, loss_bbox: 0.2343, loss_mask: 0.2387, loss: 0.7196 2024-05-31 07:48:16,958 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 10:32:45, time: 0.811, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0490, loss_cls: 0.1845, acc: 93.2144, loss_bbox: 0.2381, loss_mask: 0.2372, loss: 0.7280 2024-05-31 07:49:05,308 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 07:49:05,308 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 10:32:11, time: 0.967, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0456, loss_cls: 0.1868, acc: 93.2659, loss_bbox: 0.2307, loss_mask: 0.2399, loss: 0.7226 2024-05-31 07:49:45,540 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 10:31:29, time: 0.804, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0446, loss_cls: 0.1822, acc: 93.3022, loss_bbox: 0.2325, loss_mask: 0.2355, loss: 0.7120 2024-05-31 07:50:30,639 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 10:30:52, time: 0.902, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0452, loss_cls: 0.1816, acc: 93.4258, loss_bbox: 0.2318, loss_mask: 0.2332, loss: 0.7103 2024-05-31 07:51:10,939 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 10:30:09, time: 0.806, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0435, loss_cls: 0.1846, acc: 93.2856, loss_bbox: 0.2287, loss_mask: 0.2373, loss: 0.7111 2024-05-31 07:51:51,918 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 10:29:28, time: 0.820, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0456, loss_cls: 0.1949, acc: 92.7229, loss_bbox: 0.2427, loss_mask: 0.2388, loss: 0.7410 2024-05-31 07:52:34,721 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 10:28:48, time: 0.856, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0439, loss_cls: 0.1879, acc: 93.0039, loss_bbox: 0.2429, loss_mask: 0.2377, loss: 0.7312 2024-05-31 07:53:15,244 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, eta: 10:28:06, time: 0.810, data_time: 
0.062, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0438, loss_cls: 0.1826, acc: 93.3154, loss_bbox: 0.2336, loss_mask: 0.2353, loss: 0.7140 2024-05-31 07:53:56,786 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 10:27:25, time: 0.831, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0408, loss_cls: 0.1716, acc: 93.7561, loss_bbox: 0.2193, loss_mask: 0.2365, loss: 0.6864 2024-05-31 07:54:37,375 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 10:26:43, time: 0.812, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0412, loss_cls: 0.1795, acc: 93.4875, loss_bbox: 0.2249, loss_mask: 0.2334, loss: 0.6968 2024-05-31 07:55:17,742 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 10:26:01, time: 0.807, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0455, loss_cls: 0.1897, acc: 93.0364, loss_bbox: 0.2404, loss_mask: 0.2449, loss: 0.7384 2024-05-31 07:55:57,724 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 10:25:18, time: 0.800, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0442, loss_cls: 0.1863, acc: 93.1907, loss_bbox: 0.2330, loss_mask: 0.2356, loss: 0.7163 2024-05-31 07:56:37,816 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 10:24:36, time: 0.802, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0441, loss_cls: 0.1903, acc: 93.0322, loss_bbox: 0.2357, loss_mask: 0.2381, loss: 0.7249 2024-05-31 07:57:18,173 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 10:23:53, time: 0.807, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0437, loss_cls: 0.1798, acc: 93.3977, loss_bbox: 0.2296, loss_mask: 0.2295, loss: 0.6993 2024-05-31 07:57:58,922 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 10:23:12, time: 0.815, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0474, loss_cls: 0.1923, acc: 92.9812, loss_bbox: 0.2417, loss_mask: 0.2385, loss: 0.7390 2024-05-31 07:58:39,290 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 10:22:29, time: 0.807, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0396, loss_cls: 0.1717, acc: 93.6743, loss_bbox: 0.2207, loss_mask: 0.2300, loss: 0.6784 2024-05-31 07:59:19,409 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 10:21:47, time: 0.802, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0423, loss_cls: 0.1833, acc: 93.2412, loss_bbox: 0.2336, loss_mask: 0.2289, loss: 0.7060 2024-05-31 08:00:03,967 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 10:21:09, time: 0.891, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0466, loss_cls: 0.1885, acc: 93.1716, loss_bbox: 0.2375, loss_mask: 0.2338, loss: 0.7244 2024-05-31 08:00:44,366 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 10:20:27, time: 0.808, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0469, loss_cls: 0.1887, acc: 93.0374, loss_bbox: 0.2394, loss_mask: 0.2412, loss: 0.7355 2024-05-31 08:01:24,719 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 10:19:45, time: 0.807, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0455, loss_cls: 0.1947, acc: 92.9570, loss_bbox: 0.2454, loss_mask: 0.2450, loss: 0.7504 2024-05-31 08:02:04,905 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 10:19:02, time: 0.804, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0425, loss_cls: 0.1862, acc: 
93.1733, loss_bbox: 0.2374, loss_mask: 0.2325, loss: 0.7160 2024-05-31 08:02:45,163 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py 2024-05-31 08:02:45,164 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 10:18:20, time: 0.805, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0415, loss_cls: 0.1834, acc: 93.3750, loss_bbox: 0.2308, loss_mask: 0.2405, loss: 0.7142 2024-05-31 08:03:30,747 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 10:17:43, time: 0.912, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0453, loss_cls: 0.1899, acc: 92.9778, loss_bbox: 0.2443, loss_mask: 0.2436, loss: 0.7430 2024-05-31 08:04:13,896 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 10:17:04, time: 0.863, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0440, loss_cls: 0.1793, acc: 93.3982, loss_bbox: 0.2306, loss_mask: 0.2344, loss: 0.7059 2024-05-31 08:05:00,170 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 10:16:28, time: 0.925, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0464, loss_cls: 0.1854, acc: 93.1497, loss_bbox: 0.2359, loss_mask: 0.2399, loss: 0.7272 2024-05-31 08:05:40,917 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 10:15:46, time: 0.815, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0421, loss_cls: 0.1771, acc: 93.3752, loss_bbox: 0.2267, loss_mask: 0.2280, loss: 0.6918 2024-05-31 08:06:21,366 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 10:15:04, time: 0.809, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0444, loss_cls: 0.1886, acc: 93.1050, loss_bbox: 0.2293, loss_mask: 0.2345, loss: 0.7171 2024-05-31 08:07:04,150 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 10:14:24, time: 0.856, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0457, loss_cls: 0.1861, acc: 93.0845, loss_bbox: 0.2427, loss_mask: 0.2434, loss: 0.7365 2024-05-31 08:07:44,495 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 10:13:42, time: 0.807, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0427, loss_cls: 0.1846, acc: 93.2229, loss_bbox: 0.2360, loss_mask: 0.2401, loss: 0.7213 2024-05-31 08:08:26,996 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 10:13:02, time: 0.850, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0459, loss_cls: 0.1870, acc: 93.0964, loss_bbox: 0.2364, loss_mask: 0.2404, loss: 0.7280 2024-05-31 08:09:07,102 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 10:12:20, time: 0.802, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0462, loss_cls: 0.1880, acc: 93.1716, loss_bbox: 0.2339, loss_mask: 0.2391, loss: 0.7275 2024-05-31 08:09:48,256 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 10:11:38, time: 0.823, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0466, loss_cls: 0.1923, acc: 92.8064, loss_bbox: 0.2442, loss_mask: 0.2397, loss: 0.7415 2024-05-31 08:10:28,909 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 10:10:56, time: 0.813, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0465, loss_cls: 0.1915, acc: 92.9648, loss_bbox: 0.2446, loss_mask: 0.2423, loss: 0.7441 2024-05-31 08:11:09,553 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 10:10:14, time: 0.813, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0443, loss_cls: 
0.1883, acc: 92.9214, loss_bbox: 0.2425, loss_mask: 0.2378, loss: 0.7309
2024-05-31 08:11:49,813 - mmdet - INFO - Epoch [6][7000/7330] lr: 1.000e-04, eta: 10:09:32, time: 0.805, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0438, loss_cls: 0.1818, acc: 93.3113, loss_bbox: 0.2301, loss_mask: 0.2321, loss: 0.7067
2024-05-31 08:12:30,329 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 10:08:50, time: 0.810, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0459, loss_cls: 0.1937, acc: 92.9766, loss_bbox: 0.2438, loss_mask: 0.2419, loss: 0.7439
2024-05-31 08:13:11,246 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 10:08:08, time: 0.818, data_time: 0.081, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0470, loss_cls: 0.1918, acc: 93.0911, loss_bbox: 0.2389, loss_mask: 0.2383, loss: 0.7337
2024-05-31 08:13:51,707 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 10:07:26, time: 0.809, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0453, loss_cls: 0.1861, acc: 93.2776, loss_bbox: 0.2346, loss_mask: 0.2348, loss: 0.7208
2024-05-31 08:14:32,336 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 10:06:44, time: 0.813, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0449, loss_cls: 0.1872, acc: 93.0410, loss_bbox: 0.2395, loss_mask: 0.2391, loss: 0.7289
2024-05-31 08:15:14,913 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 10:06:05, time: 0.852, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0439, loss_cls: 0.1823, acc: 93.3047, loss_bbox: 0.2322, loss_mask: 0.2346, loss: 0.7105
2024-05-31 08:15:55,756 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 10:05:23, time: 0.817, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0443, loss_cls: 0.1830, acc: 93.2271, loss_bbox: 0.2330, loss_mask: 0.2309, loss: 0.7084
2024-05-31 08:16:20,365 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-05-31 08:18:12,451 - mmdet - INFO - Evaluating bbox...
2024-05-31 08:18:35,741 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.455
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.691
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.503
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.279
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.501
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.616
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.576
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.576
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.576
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.384
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.624
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.740
2024-05-31 08:18:35,741 - mmdet - INFO - Evaluating segm...
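Note on the twelve-row summary above: it is the standard COCO-style table keyed by IoU threshold, object area, and maxDets, and its six AP rows are the same six numbers that reappear in the Epoch(val) record as bbox_mAP_copypaste (0.447 0.689 0.489 0.272 0.491 0.609 for epoch 5 earlier in this log). Below is a small sketch that parses rows of this shape into a dictionary and rebuilds that copypaste string; it assumes Python 3 with the standard library only, and the regex targets the row layout printed here rather than any pycocotools or mmdet API.

```python
import re

# Row shape as printed above, e.g.
#   Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.455
ROW = re.compile(r"Average (Precision|Recall)\s+\(A[PR]\) @\[ IoU=([\d.:]+)\s*\| "
                 r"area=\s*(\w+) \| maxDets=\s*(\d+) \] = ([\d.]+)")

def parse_summary(text):
    """Return {(metric, iou, area, max_dets): value} for every row found in text."""
    return {(m, iou, area, int(md)): float(v)
            for m, iou, area, md, v in ROW.findall(text)}

# The six AP rows of the epoch-6 bbox evaluation above, pasted as one string.
text = ("Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.455 "
        "Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.691 "
        "Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.503 "
        "Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.279 "
        "Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.501 "
        "Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.616")

ap = parse_summary(text)
# The same six numbers, in table order, form the bbox_mAP_copypaste field of the
# corresponding Epoch(val) record.
print(" ".join(f"{v:.3f}" for v in ap.values()))
# -> 0.455 0.691 0.503 0.279 0.501 0.616
```

Run against the epoch-6 box rows above, it prints 0.455 0.691 0.503 0.279 0.501 0.616, matching the bbox_mAP_copypaste value recorded a few lines below.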
2024-05-31 08:19:02,380 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.412
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.657
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.439
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.196
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.448
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.632
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.317
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.576
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.716
2024-05-31 08:19:02,879 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 08:19:02,881 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4550, bbox_mAP_50: 0.6910, bbox_mAP_75: 0.5030, bbox_mAP_s: 0.2790, bbox_mAP_m: 0.5010, bbox_mAP_l: 0.6160, bbox_mAP_copypaste: 0.455 0.691 0.503 0.279 0.501 0.616, segm_mAP: 0.4120, segm_mAP_50: 0.6570, segm_mAP_75: 0.4390, segm_mAP_s: 0.1960, segm_mAP_m: 0.4480, segm_mAP_l: 0.6320, segm_mAP_copypaste: 0.412 0.657 0.439 0.196 0.448 0.632
2024-05-31 08:19:50,911 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 10:03:59, time: 0.960, data_time: 0.187, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0408, loss_cls: 0.1733, acc: 93.5696, loss_bbox: 0.2272, loss_mask: 0.2289, loss: 0.6863
2024-05-31 08:20:36,727 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 10:03:22, time: 0.916, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0399, loss_cls: 0.1730, acc: 93.5710, loss_bbox: 0.2257, loss_mask: 0.2264, loss: 0.6814
2024-05-31 08:21:17,929 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 10:02:41, time: 0.824, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0439, loss_cls: 0.1745, acc: 93.5408, loss_bbox: 0.2264, loss_mask: 0.2297, loss: 0.6905
2024-05-31 08:21:58,913 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 10:01:59, time: 0.820, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0440, loss_cls: 0.1799, acc: 93.2712, loss_bbox: 0.2361, loss_mask: 0.2398, loss: 0.7167
2024-05-31 08:22:40,224 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 10:01:18, time: 0.826, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0454, loss_cls: 0.1840, acc: 93.1826, loss_bbox: 0.2346, loss_mask: 0.2309, loss: 0.7116
2024-05-31 08:23:21,198 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 10:00:37, time: 0.820, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0426, loss_cls: 0.1676, acc: 93.8118, loss_bbox: 0.2239, loss_mask: 0.2262, loss: 0.6770
2024-05-31 08:24:02,658 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 9:59:56, time: 0.829, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0450, loss_cls: 0.1886, acc: 93.1162, loss_bbox: 0.2372, loss_mask: 0.2360, loss: 0.7245
2024-05-31 08:24:43,577 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 9:59:14, time: 0.818, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0464, loss_cls: 0.1852, acc: 93.1013, loss_bbox: 0.2363,
loss_mask: 0.2344, loss: 0.7203 2024-05-31 08:25:25,417 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 9:58:33, time: 0.836, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0444, loss_cls: 0.1810, acc: 93.2571, loss_bbox: 0.2371, loss_mask: 0.2355, loss: 0.7158 2024-05-31 08:26:08,086 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 9:57:53, time: 0.854, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0405, loss_cls: 0.1731, acc: 93.5474, loss_bbox: 0.2218, loss_mask: 0.2325, loss: 0.6840 2024-05-31 08:26:48,257 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 9:57:11, time: 0.803, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0413, loss_cls: 0.1727, acc: 93.7085, loss_bbox: 0.2212, loss_mask: 0.2339, loss: 0.6847 2024-05-31 08:27:28,879 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 9:56:29, time: 0.812, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0437, loss_cls: 0.1756, acc: 93.5352, loss_bbox: 0.2302, loss_mask: 0.2329, loss: 0.6996 2024-05-31 08:28:09,724 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 9:55:48, time: 0.817, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0422, loss_cls: 0.1740, acc: 93.5754, loss_bbox: 0.2238, loss_mask: 0.2267, loss: 0.6833 2024-05-31 08:28:51,521 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 9:55:07, time: 0.836, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0409, loss_cls: 0.1646, acc: 93.9536, loss_bbox: 0.2100, loss_mask: 0.2222, loss: 0.6525 2024-05-31 08:29:32,154 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 9:54:25, time: 0.813, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0442, loss_cls: 0.1813, acc: 93.2539, loss_bbox: 0.2374, loss_mask: 0.2376, loss: 0.7168 2024-05-31 08:30:12,448 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 9:53:43, time: 0.806, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0414, loss_cls: 0.1674, acc: 93.7644, loss_bbox: 0.2235, loss_mask: 0.2311, loss: 0.6790 2024-05-31 08:30:53,933 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 9:53:02, time: 0.830, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0463, loss_cls: 0.1804, acc: 93.2617, loss_bbox: 0.2308, loss_mask: 0.2311, loss: 0.7053 2024-05-31 08:31:34,197 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 9:52:20, time: 0.805, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0434, loss_cls: 0.1777, acc: 93.3577, loss_bbox: 0.2302, loss_mask: 0.2384, loss: 0.7057 2024-05-31 08:32:15,279 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 9:51:38, time: 0.822, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0442, loss_cls: 0.1880, acc: 93.0156, loss_bbox: 0.2372, loss_mask: 0.2323, loss: 0.7193 2024-05-31 08:32:56,490 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 9:50:57, time: 0.824, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0413, loss_cls: 0.1767, acc: 93.4409, loss_bbox: 0.2265, loss_mask: 0.2310, loss: 0.6922 2024-05-31 08:33:39,814 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 9:50:18, time: 0.866, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0472, loss_cls: 0.1908, acc: 92.9697, loss_bbox: 0.2404, loss_mask: 0.2369, loss: 0.7317 2024-05-31 08:34:21,392 - mmdet - INFO - Epoch [7][1100/7330] lr: 1.000e-04, eta: 
9:49:37, time: 0.832, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0447, loss_cls: 0.1741, acc: 93.5183, loss_bbox: 0.2295, loss_mask: 0.2340, loss: 0.6992 2024-05-31 08:35:07,608 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 9:49:00, time: 0.924, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0409, loss_cls: 0.1683, acc: 93.7205, loss_bbox: 0.2231, loss_mask: 0.2339, loss: 0.6821 2024-05-31 08:35:48,344 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 9:48:18, time: 0.815, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0412, loss_cls: 0.1631, acc: 93.9487, loss_bbox: 0.2135, loss_mask: 0.2201, loss: 0.6538 2024-05-31 08:36:29,349 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 9:47:37, time: 0.820, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0442, loss_cls: 0.1811, acc: 93.3530, loss_bbox: 0.2281, loss_mask: 0.2300, loss: 0.7011 2024-05-31 08:37:15,449 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 9:47:00, time: 0.922, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0454, loss_cls: 0.1731, acc: 93.6052, loss_bbox: 0.2290, loss_mask: 0.2290, loss: 0.6933 2024-05-31 08:37:55,883 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 9:46:18, time: 0.809, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0397, loss_cls: 0.1724, acc: 93.6335, loss_bbox: 0.2238, loss_mask: 0.2298, loss: 0.6824 2024-05-31 08:38:39,527 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 9:45:39, time: 0.873, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0450, loss_cls: 0.1897, acc: 93.0044, loss_bbox: 0.2410, loss_mask: 0.2330, loss: 0.7270 2024-05-31 08:39:20,888 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 9:44:58, time: 0.827, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0450, loss_cls: 0.1845, acc: 93.1572, loss_bbox: 0.2355, loss_mask: 0.2336, loss: 0.7174 2024-05-31 08:40:01,457 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 9:44:16, time: 0.811, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0421, loss_cls: 0.1824, acc: 93.2178, loss_bbox: 0.2332, loss_mask: 0.2342, loss: 0.7087 2024-05-31 08:40:43,893 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 9:43:36, time: 0.849, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0413, loss_cls: 0.1776, acc: 93.4182, loss_bbox: 0.2287, loss_mask: 0.2321, loss: 0.6970 2024-05-31 08:41:25,237 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 9:42:55, time: 0.827, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0444, loss_cls: 0.1864, acc: 92.9878, loss_bbox: 0.2392, loss_mask: 0.2330, loss: 0.7215 2024-05-31 08:42:06,369 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 9:42:13, time: 0.823, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0432, loss_cls: 0.1760, acc: 93.4443, loss_bbox: 0.2253, loss_mask: 0.2303, loss: 0.6919 2024-05-31 08:42:46,723 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 9:41:31, time: 0.807, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0413, loss_cls: 0.1787, acc: 93.3547, loss_bbox: 0.2291, loss_mask: 0.2333, loss: 0.6978 2024-05-31 08:43:29,780 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 9:40:52, time: 0.861, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0446, loss_cls: 
0.1785, acc: 93.4224, loss_bbox: 0.2290, loss_mask: 0.2333, loss: 0.7024 2024-05-31 08:44:10,863 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 9:40:10, time: 0.821, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0449, loss_cls: 0.1858, acc: 93.1182, loss_bbox: 0.2392, loss_mask: 0.2344, loss: 0.7219 2024-05-31 08:44:51,470 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 9:39:28, time: 0.813, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0437, loss_cls: 0.1817, acc: 93.2805, loss_bbox: 0.2322, loss_mask: 0.2393, loss: 0.7125 2024-05-31 08:45:32,024 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 9:38:46, time: 0.811, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0430, loss_cls: 0.1761, acc: 93.4895, loss_bbox: 0.2265, loss_mask: 0.2303, loss: 0.6936 2024-05-31 08:46:12,513 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 9:38:04, time: 0.810, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0413, loss_cls: 0.1766, acc: 93.5513, loss_bbox: 0.2212, loss_mask: 0.2276, loss: 0.6843 2024-05-31 08:46:53,218 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 9:37:23, time: 0.814, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0409, loss_cls: 0.1757, acc: 93.5640, loss_bbox: 0.2263, loss_mask: 0.2293, loss: 0.6882 2024-05-31 08:47:34,110 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 9:36:41, time: 0.818, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0429, loss_cls: 0.1818, acc: 93.2610, loss_bbox: 0.2337, loss_mask: 0.2358, loss: 0.7117 2024-05-31 08:48:16,924 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 9:36:01, time: 0.856, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0407, loss_cls: 0.1730, acc: 93.6187, loss_bbox: 0.2234, loss_mask: 0.2270, loss: 0.6809 2024-05-31 08:48:58,051 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 9:35:20, time: 0.823, data_time: 0.077, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0430, loss_cls: 0.1773, acc: 93.4219, loss_bbox: 0.2273, loss_mask: 0.2308, loss: 0.6952 2024-05-31 08:49:43,549 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 9:34:42, time: 0.910, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0424, loss_cls: 0.1746, acc: 93.5000, loss_bbox: 0.2238, loss_mask: 0.2286, loss: 0.6870 2024-05-31 08:50:24,217 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 9:34:01, time: 0.813, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0410, loss_cls: 0.1778, acc: 93.3850, loss_bbox: 0.2327, loss_mask: 0.2329, loss: 0.7005 2024-05-31 08:51:04,665 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 9:33:19, time: 0.809, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0442, loss_cls: 0.1782, acc: 93.3501, loss_bbox: 0.2297, loss_mask: 0.2300, loss: 0.6994 2024-05-31 08:51:52,459 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 9:32:43, time: 0.956, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0418, loss_cls: 0.1804, acc: 93.3970, loss_bbox: 0.2296, loss_mask: 0.2291, loss: 0.6988 2024-05-31 08:52:33,327 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 9:32:02, time: 0.817, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0453, loss_cls: 0.1774, acc: 93.3711, loss_bbox: 0.2294, loss_mask: 0.2350, loss: 0.7053 2024-05-31 08:53:16,087 - mmdet - 
INFO - Epoch [7][2450/7330] lr: 1.000e-04, eta: 9:31:22, time: 0.855, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0435, loss_cls: 0.1826, acc: 93.2288, loss_bbox: 0.2401, loss_mask: 0.2414, loss: 0.7251 2024-05-31 08:53:56,461 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 9:30:40, time: 0.807, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0428, loss_cls: 0.1742, acc: 93.5320, loss_bbox: 0.2272, loss_mask: 0.2316, loss: 0.6924 2024-05-31 08:54:37,523 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 9:29:58, time: 0.821, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0452, loss_cls: 0.1903, acc: 92.9878, loss_bbox: 0.2438, loss_mask: 0.2377, loss: 0.7353 2024-05-31 08:55:20,153 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 9:29:18, time: 0.853, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0430, loss_cls: 0.1753, acc: 93.4854, loss_bbox: 0.2255, loss_mask: 0.2293, loss: 0.6907 2024-05-31 08:56:00,650 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 9:28:36, time: 0.810, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0437, loss_cls: 0.1797, acc: 93.3403, loss_bbox: 0.2307, loss_mask: 0.2328, loss: 0.7031 2024-05-31 08:56:41,138 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 9:27:54, time: 0.810, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0444, loss_cls: 0.1870, acc: 93.0020, loss_bbox: 0.2428, loss_mask: 0.2340, loss: 0.7251 2024-05-31 08:57:21,895 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 9:27:12, time: 0.815, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0412, loss_cls: 0.1682, acc: 93.7380, loss_bbox: 0.2235, loss_mask: 0.2260, loss: 0.6750 2024-05-31 08:58:05,080 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 9:26:33, time: 0.864, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0437, loss_cls: 0.1787, acc: 93.3335, loss_bbox: 0.2331, loss_mask: 0.2311, loss: 0.7053 2024-05-31 08:58:45,271 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 9:25:51, time: 0.804, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0411, loss_cls: 0.1709, acc: 93.6904, loss_bbox: 0.2219, loss_mask: 0.2249, loss: 0.6758 2024-05-31 08:59:26,530 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 9:25:09, time: 0.825, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0441, loss_cls: 0.1798, acc: 93.3386, loss_bbox: 0.2296, loss_mask: 0.2327, loss: 0.7053 2024-05-31 09:00:07,291 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 9:24:28, time: 0.815, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0429, loss_cls: 0.1803, acc: 93.2837, loss_bbox: 0.2296, loss_mask: 0.2299, loss: 0.6993 2024-05-31 09:00:47,893 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 9:23:46, time: 0.812, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0426, loss_cls: 0.1763, acc: 93.4395, loss_bbox: 0.2235, loss_mask: 0.2319, loss: 0.6910 2024-05-31 09:01:28,757 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 9:23:04, time: 0.817, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0420, loss_cls: 0.1772, acc: 93.4531, loss_bbox: 0.2296, loss_mask: 0.2361, loss: 0.7018 2024-05-31 09:02:09,452 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 9:22:22, time: 0.814, data_time: 0.048, memory: 18874, 
loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0425, loss_cls: 0.1740, acc: 93.5251, loss_bbox: 0.2269, loss_mask: 0.2338, loss: 0.6937 2024-05-31 09:02:52,059 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 9:21:42, time: 0.852, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0442, loss_cls: 0.1748, acc: 93.4836, loss_bbox: 0.2250, loss_mask: 0.2314, loss: 0.6940 2024-05-31 09:03:32,660 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 9:21:00, time: 0.812, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0427, loss_cls: 0.1852, acc: 93.1663, loss_bbox: 0.2384, loss_mask: 0.2302, loss: 0.7135 2024-05-31 09:04:18,448 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 9:20:23, time: 0.916, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0451, loss_cls: 0.1853, acc: 93.0977, loss_bbox: 0.2374, loss_mask: 0.2311, loss: 0.7176 2024-05-31 09:04:58,849 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 9:19:41, time: 0.808, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0412, loss_cls: 0.1792, acc: 93.3325, loss_bbox: 0.2277, loss_mask: 0.2314, loss: 0.6974 2024-05-31 09:05:40,106 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 9:19:00, time: 0.825, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0453, loss_cls: 0.1865, acc: 93.0791, loss_bbox: 0.2402, loss_mask: 0.2373, loss: 0.7280 2024-05-31 09:06:26,276 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 9:18:23, time: 0.923, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0458, loss_cls: 0.1800, acc: 93.3362, loss_bbox: 0.2338, loss_mask: 0.2330, loss: 0.7105 2024-05-31 09:07:07,324 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 9:17:41, time: 0.821, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0438, loss_cls: 0.1816, acc: 93.3411, loss_bbox: 0.2310, loss_mask: 0.2330, loss: 0.7076 2024-05-31 09:07:50,166 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 9:17:01, time: 0.857, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0432, loss_cls: 0.1862, acc: 93.1135, loss_bbox: 0.2343, loss_mask: 0.2300, loss: 0.7103 2024-05-31 09:08:30,731 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 9:16:19, time: 0.811, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0408, loss_cls: 0.1722, acc: 93.5764, loss_bbox: 0.2244, loss_mask: 0.2258, loss: 0.6795 2024-05-31 09:09:11,850 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 9:15:38, time: 0.822, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0443, loss_cls: 0.1818, acc: 93.2551, loss_bbox: 0.2346, loss_mask: 0.2346, loss: 0.7116 2024-05-31 09:09:54,903 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 9:14:58, time: 0.861, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0423, loss_cls: 0.1786, acc: 93.3940, loss_bbox: 0.2290, loss_mask: 0.2273, loss: 0.6957 2024-05-31 09:10:35,928 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 9:14:17, time: 0.820, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0436, loss_cls: 0.1733, acc: 93.4329, loss_bbox: 0.2229, loss_mask: 0.2277, loss: 0.6844 2024-05-31 09:11:16,439 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 9:13:35, time: 0.810, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0435, loss_cls: 0.1783, acc: 93.3252, loss_bbox: 0.2330, loss_mask: 
0.2336, loss: 0.7046 2024-05-31 09:11:56,946 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 9:12:53, time: 0.810, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0451, loss_cls: 0.1838, acc: 93.1907, loss_bbox: 0.2338, loss_mask: 0.2405, loss: 0.7227 2024-05-31 09:12:40,233 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 9:12:13, time: 0.866, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0458, loss_cls: 0.1817, acc: 93.3984, loss_bbox: 0.2332, loss_mask: 0.2374, loss: 0.7162 2024-05-31 09:13:20,879 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 9:11:31, time: 0.813, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0424, loss_cls: 0.1792, acc: 93.4866, loss_bbox: 0.2218, loss_mask: 0.2347, loss: 0.6958 2024-05-31 09:14:01,793 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 9:10:50, time: 0.818, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0433, loss_cls: 0.1757, acc: 93.4136, loss_bbox: 0.2305, loss_mask: 0.2391, loss: 0.7048 2024-05-31 09:14:43,586 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 9:10:09, time: 0.835, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0437, loss_cls: 0.1850, acc: 93.1025, loss_bbox: 0.2370, loss_mask: 0.2369, loss: 0.7220 2024-05-31 09:15:23,952 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 9:09:27, time: 0.808, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0436, loss_cls: 0.1812, acc: 93.3809, loss_bbox: 0.2328, loss_mask: 0.2339, loss: 0.7110 2024-05-31 09:16:04,991 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 9:08:45, time: 0.821, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0411, loss_cls: 0.1768, acc: 93.4807, loss_bbox: 0.2273, loss_mask: 0.2255, loss: 0.6871 2024-05-31 09:16:46,060 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 9:08:04, time: 0.821, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0409, loss_cls: 0.1809, acc: 93.2844, loss_bbox: 0.2291, loss_mask: 0.2304, loss: 0.6978 2024-05-31 09:17:28,714 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 9:07:24, time: 0.853, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0452, loss_cls: 0.1856, acc: 93.1658, loss_bbox: 0.2335, loss_mask: 0.2354, loss: 0.7180 2024-05-31 09:18:09,689 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 9:06:42, time: 0.819, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0437, loss_cls: 0.1854, acc: 93.0417, loss_bbox: 0.2354, loss_mask: 0.2329, loss: 0.7155 2024-05-31 09:18:53,312 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 9:06:03, time: 0.872, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0440, loss_cls: 0.1762, acc: 93.3994, loss_bbox: 0.2249, loss_mask: 0.2301, loss: 0.6907 2024-05-31 09:19:35,863 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 9:05:23, time: 0.851, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1862, acc: 93.0793, loss_bbox: 0.2416, loss_mask: 0.2343, loss: 0.7222 2024-05-31 09:20:16,511 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 9:04:41, time: 0.813, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0436, loss_cls: 0.1880, acc: 93.1074, loss_bbox: 0.2380, loss_mask: 0.2402, loss: 0.7265 2024-05-31 09:21:02,407 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 
9:04:03, time: 0.918, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0438, loss_cls: 0.1809, acc: 93.2891, loss_bbox: 0.2307, loss_mask: 0.2347, loss: 0.7080 2024-05-31 09:21:43,139 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 9:03:21, time: 0.815, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0403, loss_cls: 0.1756, acc: 93.4739, loss_bbox: 0.2233, loss_mask: 0.2256, loss: 0.6814 2024-05-31 09:22:26,025 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 9:02:41, time: 0.858, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0416, loss_cls: 0.1821, acc: 93.1633, loss_bbox: 0.2334, loss_mask: 0.2309, loss: 0.7050 2024-05-31 09:23:06,599 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 9:02:00, time: 0.811, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0430, loss_cls: 0.1722, acc: 93.6179, loss_bbox: 0.2253, loss_mask: 0.2278, loss: 0.6843 2024-05-31 09:23:47,742 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 9:01:18, time: 0.823, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0427, loss_cls: 0.1748, acc: 93.6558, loss_bbox: 0.2200, loss_mask: 0.2254, loss: 0.6790 2024-05-31 09:24:31,014 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 9:00:39, time: 0.865, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0431, loss_cls: 0.1797, acc: 93.4287, loss_bbox: 0.2275, loss_mask: 0.2389, loss: 0.7060 2024-05-31 09:25:12,508 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 8:59:57, time: 0.830, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0436, loss_cls: 0.1764, acc: 93.4404, loss_bbox: 0.2281, loss_mask: 0.2352, loss: 0.7013 2024-05-31 09:25:53,371 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 8:59:16, time: 0.817, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0431, loss_cls: 0.1854, acc: 93.2429, loss_bbox: 0.2337, loss_mask: 0.2369, loss: 0.7163 2024-05-31 09:26:34,450 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 8:58:34, time: 0.822, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0449, loss_cls: 0.1813, acc: 93.3264, loss_bbox: 0.2322, loss_mask: 0.2343, loss: 0.7108 2024-05-31 09:27:16,963 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 8:57:54, time: 0.850, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0425, loss_cls: 0.1752, acc: 93.4951, loss_bbox: 0.2254, loss_mask: 0.2334, loss: 0.6931 2024-05-31 09:27:58,652 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 8:57:13, time: 0.834, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0448, loss_cls: 0.1838, acc: 93.2886, loss_bbox: 0.2328, loss_mask: 0.2341, loss: 0.7135 2024-05-31 09:28:38,718 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 8:56:31, time: 0.801, data_time: 0.040, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0401, loss_cls: 0.1735, acc: 93.6519, loss_bbox: 0.2222, loss_mask: 0.2297, loss: 0.6815 2024-05-31 09:29:19,329 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 8:55:49, time: 0.812, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0390, loss_cls: 0.1727, acc: 93.5901, loss_bbox: 0.2214, loss_mask: 0.2243, loss: 0.6730 2024-05-31 09:30:00,362 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 8:55:07, time: 0.821, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0455, loss_cls: 
0.1825, acc: 93.1938, loss_bbox: 0.2400, loss_mask: 0.2380, loss: 0.7236 2024-05-31 09:30:40,797 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 8:54:25, time: 0.809, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0409, loss_cls: 0.1733, acc: 93.5522, loss_bbox: 0.2240, loss_mask: 0.2351, loss: 0.6901 2024-05-31 09:31:21,831 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 8:53:44, time: 0.821, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0452, loss_cls: 0.1749, acc: 93.5493, loss_bbox: 0.2257, loss_mask: 0.2283, loss: 0.6908 2024-05-31 09:32:04,158 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 8:53:03, time: 0.847, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0426, loss_cls: 0.1765, acc: 93.4475, loss_bbox: 0.2215, loss_mask: 0.2305, loss: 0.6878 2024-05-31 09:32:44,635 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 8:52:22, time: 0.810, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0436, loss_cls: 0.1744, acc: 93.5574, loss_bbox: 0.2229, loss_mask: 0.2326, loss: 0.6906 2024-05-31 09:33:29,611 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 8:51:43, time: 0.900, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0441, loss_cls: 0.1793, acc: 93.3301, loss_bbox: 0.2296, loss_mask: 0.2341, loss: 0.7045 2024-05-31 09:34:13,157 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 8:51:04, time: 0.870, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0422, loss_cls: 0.1821, acc: 93.3232, loss_bbox: 0.2244, loss_mask: 0.2310, loss: 0.6967 2024-05-31 09:34:53,588 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 8:50:22, time: 0.809, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0420, loss_cls: 0.1757, acc: 93.4756, loss_bbox: 0.2221, loss_mask: 0.2280, loss: 0.6834 2024-05-31 09:35:39,961 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 8:49:44, time: 0.927, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0421, loss_cls: 0.1769, acc: 93.4148, loss_bbox: 0.2250, loss_mask: 0.2298, loss: 0.6907 2024-05-31 09:36:20,401 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 8:49:02, time: 0.809, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0405, loss_cls: 0.1739, acc: 93.6187, loss_bbox: 0.2217, loss_mask: 0.2301, loss: 0.6828 2024-05-31 09:37:03,647 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 8:48:23, time: 0.865, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0438, loss_cls: 0.1792, acc: 93.4365, loss_bbox: 0.2295, loss_mask: 0.2311, loss: 0.7012 2024-05-31 09:37:45,223 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 8:47:41, time: 0.831, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0443, loss_cls: 0.1792, acc: 93.4685, loss_bbox: 0.2285, loss_mask: 0.2346, loss: 0.7041 2024-05-31 09:38:25,172 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 8:46:59, time: 0.799, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0397, loss_cls: 0.1658, acc: 93.9441, loss_bbox: 0.2143, loss_mask: 0.2234, loss: 0.6585 2024-05-31 09:39:08,120 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 8:46:19, time: 0.859, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0452, loss_cls: 0.1828, acc: 93.2112, loss_bbox: 0.2345, loss_mask: 0.2335, loss: 0.7141 2024-05-31 09:39:48,353 - mmdet - 
INFO - Epoch [7][5800/7330] lr: 1.000e-04, eta: 8:45:37, time: 0.805, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0415, loss_cls: 0.1821, acc: 93.1973, loss_bbox: 0.2306, loss_mask: 0.2317, loss: 0.7022 2024-05-31 09:40:29,077 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 8:44:55, time: 0.815, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0420, loss_cls: 0.1794, acc: 93.4058, loss_bbox: 0.2266, loss_mask: 0.2275, loss: 0.6927 2024-05-31 09:41:10,431 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 8:44:14, time: 0.827, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0457, loss_cls: 0.1873, acc: 93.0510, loss_bbox: 0.2405, loss_mask: 0.2330, loss: 0.7264 2024-05-31 09:41:53,759 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 8:43:34, time: 0.867, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0424, loss_cls: 0.1807, acc: 93.2832, loss_bbox: 0.2246, loss_mask: 0.2303, loss: 0.6947 2024-05-31 09:42:34,405 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 8:42:52, time: 0.813, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0426, loss_cls: 0.1826, acc: 93.2988, loss_bbox: 0.2303, loss_mask: 0.2333, loss: 0.7061 2024-05-31 09:43:15,948 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 8:42:11, time: 0.831, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0449, loss_cls: 0.1818, acc: 93.1536, loss_bbox: 0.2353, loss_mask: 0.2382, loss: 0.7169 2024-05-31 09:43:56,512 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 8:41:29, time: 0.811, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0427, loss_cls: 0.1762, acc: 93.5657, loss_bbox: 0.2262, loss_mask: 0.2354, loss: 0.6971 2024-05-31 09:44:37,395 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 8:40:48, time: 0.818, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0433, loss_cls: 0.1790, acc: 93.4297, loss_bbox: 0.2293, loss_mask: 0.2318, loss: 0.7002 2024-05-31 09:45:18,460 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 8:40:06, time: 0.821, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0464, loss_cls: 0.1837, acc: 93.1016, loss_bbox: 0.2381, loss_mask: 0.2359, loss: 0.7230 2024-05-31 09:45:58,961 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 8:39:24, time: 0.810, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0403, loss_cls: 0.1717, acc: 93.5952, loss_bbox: 0.2206, loss_mask: 0.2273, loss: 0.6758 2024-05-31 09:46:42,592 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 8:38:45, time: 0.873, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0468, loss_cls: 0.1944, acc: 92.8025, loss_bbox: 0.2462, loss_mask: 0.2382, loss: 0.7452 2024-05-31 09:47:23,566 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 8:38:03, time: 0.820, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0424, loss_cls: 0.1833, acc: 93.3059, loss_bbox: 0.2318, loss_mask: 0.2369, loss: 0.7123 2024-05-31 09:48:06,815 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 8:37:23, time: 0.865, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0410, loss_cls: 0.1813, acc: 93.3379, loss_bbox: 0.2277, loss_mask: 0.2335, loss: 0.7004 2024-05-31 09:48:50,648 - mmdet - INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 8:36:44, time: 0.877, data_time: 0.060, memory: 18874, 
loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0426, loss_cls: 0.1828, acc: 93.2063, loss_bbox: 0.2350, loss_mask: 0.2363, loss: 0.7145 2024-05-31 09:49:31,116 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 8:36:02, time: 0.809, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0438, loss_cls: 0.1728, acc: 93.6484, loss_bbox: 0.2268, loss_mask: 0.2318, loss: 0.6907 2024-05-31 09:50:14,352 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 8:35:22, time: 0.865, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0437, loss_cls: 0.1866, acc: 93.2517, loss_bbox: 0.2300, loss_mask: 0.2394, loss: 0.7153 2024-05-31 09:50:57,240 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 8:34:42, time: 0.858, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0420, loss_cls: 0.1791, acc: 93.3457, loss_bbox: 0.2298, loss_mask: 0.2325, loss: 0.7005 2024-05-31 09:51:40,470 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 8:34:02, time: 0.865, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0441, loss_cls: 0.1794, acc: 93.4338, loss_bbox: 0.2277, loss_mask: 0.2263, loss: 0.6956 2024-05-31 09:52:21,352 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 8:33:21, time: 0.818, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0425, loss_cls: 0.1801, acc: 93.3435, loss_bbox: 0.2289, loss_mask: 0.2259, loss: 0.6962 2024-05-31 09:53:01,995 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 8:32:39, time: 0.813, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0450, loss_cls: 0.1834, acc: 93.2393, loss_bbox: 0.2310, loss_mask: 0.2304, loss: 0.7061 2024-05-31 09:53:42,946 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 8:31:57, time: 0.819, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0448, loss_cls: 0.1805, acc: 93.2751, loss_bbox: 0.2335, loss_mask: 0.2337, loss: 0.7105 2024-05-31 09:54:25,580 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 8:31:17, time: 0.853, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0434, loss_cls: 0.1880, acc: 93.1096, loss_bbox: 0.2359, loss_mask: 0.2382, loss: 0.7226 2024-05-31 09:55:06,517 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 8:30:35, time: 0.819, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0432, loss_cls: 0.1776, acc: 93.3611, loss_bbox: 0.2249, loss_mask: 0.2311, loss: 0.6935 2024-05-31 09:55:46,854 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 8:29:53, time: 0.807, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0408, loss_cls: 0.1721, acc: 93.5796, loss_bbox: 0.2145, loss_mask: 0.2231, loss: 0.6673 2024-05-31 09:56:29,492 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 8:29:13, time: 0.853, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0405, loss_cls: 0.1723, acc: 93.6858, loss_bbox: 0.2211, loss_mask: 0.2322, loss: 0.6831 2024-05-31 09:57:09,590 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 8:28:31, time: 0.802, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0429, loss_cls: 0.1790, acc: 93.3315, loss_bbox: 0.2288, loss_mask: 0.2378, loss: 0.7052 2024-05-31 09:57:50,502 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 8:27:49, time: 0.818, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0464, loss_cls: 0.1882, acc: 93.0352, loss_bbox: 0.2356, loss_mask: 
0.2293, loss: 0.7179
2024-05-31 09:58:31,046 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 8:27:07, time: 0.811, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0411, loss_cls: 0.1669, acc: 93.8176, loss_bbox: 0.2224, loss_mask: 0.2252, loss: 0.6726
2024-05-31 09:59:11,217 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 8:26:25, time: 0.803, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0420, loss_cls: 0.1710, acc: 93.6914, loss_bbox: 0.2221, loss_mask: 0.2324, loss: 0.6831
2024-05-31 09:59:52,649 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 8:25:44, time: 0.829, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0447, loss_cls: 0.1822, acc: 93.3806, loss_bbox: 0.2293, loss_mask: 0.2307, loss: 0.7045
2024-05-31 10:00:33,126 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 8:25:02, time: 0.810, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0442, loss_cls: 0.1784, acc: 93.3069, loss_bbox: 0.2359, loss_mask: 0.2318, loss: 0.7064
2024-05-31 10:00:57,883 - mmdet - INFO - Saving checkpoint at 7 epochs
2024-05-31 10:02:46,658 - mmdet - INFO - Evaluating bbox...
2024-05-31 10:03:08,021 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.464
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.703
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.511
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.289
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.507
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.585
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.585
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.585
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.394
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.632
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.750
2024-05-31 10:03:08,021 - mmdet - INFO - Evaluating segm...
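Note: the 12-line AP/AR block above (and the segm block that follows) is the standard pycocotools summary that mmdet prints during validation. As a minimal illustrative sketch only — the annotation and result paths below are hypothetical placeholders, not taken from this run — a table in the same format can be reproduced offline:

    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    # Hypothetical paths: COCO-format ground truth and a dumped detection-result json.
    coco_gt = COCO('data/coco/annotations/instances_val2017.json')
    coco_dt = coco_gt.loadRes('results.bbox.json')

    coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')
    coco_eval.params.maxDets = [100, 300, 1000]  # matches the maxDets values shown in this log
    coco_eval.evaluate()
    coco_eval.accumulate()
    coco_eval.summarize()  # prints the 12 AP/AR lines in the format seen above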
2024-05-31 10:03:34,687 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.419
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.668
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.446
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.204
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.458
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.636
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.533
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.533
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.533
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.325
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.580
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.725
2024-05-31 10:03:35,015 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 10:03:35,016 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4640, bbox_mAP_50: 0.7030, bbox_mAP_75: 0.5110, bbox_mAP_s: 0.2890, bbox_mAP_m: 0.5070, bbox_mAP_l: 0.6250, bbox_mAP_copypaste: 0.464 0.703 0.511 0.289 0.507 0.625, segm_mAP: 0.4190, segm_mAP_50: 0.6680, segm_mAP_75: 0.4460, segm_mAP_s: 0.2040, segm_mAP_m: 0.4580, segm_mAP_l: 0.6360, segm_mAP_copypaste: 0.419 0.668 0.446 0.204 0.458 0.636
2024-05-31 10:04:27,401 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 8:23:46, time: 1.047, data_time: 0.137, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0394, loss_cls: 0.1670, acc: 93.8179, loss_bbox: 0.2197, loss_mask: 0.2259, loss: 0.6682
2024-05-31 10:05:10,123 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 8:23:06, time: 0.855, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0410, loss_cls: 0.1675, acc: 93.6965, loss_bbox: 0.2179, loss_mask: 0.2241, loss: 0.6649
2024-05-31 10:05:51,334 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 8:22:24, time: 0.824, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0438, loss_cls: 0.1729, acc: 93.5537, loss_bbox: 0.2231, loss_mask: 0.2268, loss: 0.6810
2024-05-31 10:06:34,431 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 8:21:44, time: 0.862, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0431, loss_cls: 0.1741, acc: 93.4783, loss_bbox: 0.2270, loss_mask: 0.2251, loss: 0.6859
2024-05-31 10:07:15,043 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 8:21:03, time: 0.812, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0435, loss_cls: 0.1708, acc: 93.5720, loss_bbox: 0.2243, loss_mask: 0.2255, loss: 0.6802
2024-05-31 10:07:56,122 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 8:20:21, time: 0.822, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0459, loss_cls: 0.1752, acc: 93.3748, loss_bbox: 0.2335, loss_mask: 0.2312, loss: 0.7021
2024-05-31 10:08:38,888 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 8:19:41, time: 0.855, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0393, loss_cls: 0.1669, acc: 93.8738, loss_bbox: 0.2147, loss_mask: 0.2266, loss: 0.6624
2024-05-31 10:09:20,221 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 8:19:00, time: 0.827, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0465, loss_cls: 0.1844, acc: 93.1519, loss_bbox: 0.2378, loss_mask:
0.2302, loss: 0.7164 2024-05-31 10:10:00,332 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 8:18:18, time: 0.802, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0413, loss_cls: 0.1704, acc: 93.6340, loss_bbox: 0.2264, loss_mask: 0.2275, loss: 0.6801 2024-05-31 10:10:41,043 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 8:17:36, time: 0.814, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0425, loss_cls: 0.1723, acc: 93.6086, loss_bbox: 0.2245, loss_mask: 0.2247, loss: 0.6794 2024-05-31 10:11:21,868 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 8:16:54, time: 0.816, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0406, loss_cls: 0.1757, acc: 93.4714, loss_bbox: 0.2260, loss_mask: 0.2247, loss: 0.6818 2024-05-31 10:12:02,696 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 8:16:13, time: 0.817, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0404, loss_cls: 0.1667, acc: 93.7642, loss_bbox: 0.2203, loss_mask: 0.2270, loss: 0.6703 2024-05-31 10:12:44,123 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 8:15:31, time: 0.829, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0461, loss_cls: 0.1816, acc: 93.1338, loss_bbox: 0.2358, loss_mask: 0.2337, loss: 0.7160 2024-05-31 10:13:24,921 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 8:14:50, time: 0.816, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0429, loss_cls: 0.1694, acc: 93.7217, loss_bbox: 0.2133, loss_mask: 0.2256, loss: 0.6675 2024-05-31 10:14:05,139 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 8:14:08, time: 0.804, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0416, loss_cls: 0.1673, acc: 93.7483, loss_bbox: 0.2175, loss_mask: 0.2249, loss: 0.6667 2024-05-31 10:14:48,326 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 8:13:28, time: 0.864, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0410, loss_cls: 0.1747, acc: 93.4377, loss_bbox: 0.2256, loss_mask: 0.2272, loss: 0.6843 2024-05-31 10:15:28,745 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 8:12:46, time: 0.808, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0404, loss_cls: 0.1711, acc: 93.6414, loss_bbox: 0.2235, loss_mask: 0.2241, loss: 0.6740 2024-05-31 10:16:10,412 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 8:12:05, time: 0.833, data_time: 0.080, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0409, loss_cls: 0.1688, acc: 93.7095, loss_bbox: 0.2203, loss_mask: 0.2272, loss: 0.6737 2024-05-31 10:16:53,774 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 8:11:25, time: 0.867, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0445, loss_cls: 0.1740, acc: 93.4412, loss_bbox: 0.2274, loss_mask: 0.2297, loss: 0.6920 2024-05-31 10:17:34,047 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 8:10:43, time: 0.805, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0417, loss_cls: 0.1684, acc: 93.7244, loss_bbox: 0.2180, loss_mask: 0.2251, loss: 0.6678 2024-05-31 10:18:15,293 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 8:10:02, time: 0.825, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0433, loss_cls: 0.1712, acc: 93.6414, loss_bbox: 0.2253, loss_mask: 0.2286, loss: 0.6851 2024-05-31 10:18:55,552 - mmdet - INFO - Epoch [8][1100/7330] lr: 1.000e-04, eta: 8:09:20, time: 
0.805, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0444, loss_cls: 0.1728, acc: 93.5540, loss_bbox: 0.2270, loss_mask: 0.2292, loss: 0.6897 2024-05-31 10:19:41,098 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 8:08:41, time: 0.911, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0380, loss_cls: 0.1652, acc: 93.8711, loss_bbox: 0.2177, loss_mask: 0.2249, loss: 0.6604 2024-05-31 10:20:21,873 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 8:08:00, time: 0.815, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0420, loss_cls: 0.1718, acc: 93.5132, loss_bbox: 0.2277, loss_mask: 0.2309, loss: 0.6881 2024-05-31 10:21:07,336 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 8:07:21, time: 0.909, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0439, loss_cls: 0.1701, acc: 93.6697, loss_bbox: 0.2219, loss_mask: 0.2257, loss: 0.6780 2024-05-31 10:21:47,733 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 8:06:39, time: 0.808, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0386, loss_cls: 0.1681, acc: 93.8096, loss_bbox: 0.2181, loss_mask: 0.2250, loss: 0.6647 2024-05-31 10:22:30,509 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 8:05:59, time: 0.855, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0413, loss_cls: 0.1717, acc: 93.5701, loss_bbox: 0.2222, loss_mask: 0.2288, loss: 0.6804 2024-05-31 10:23:11,096 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 8:05:17, time: 0.812, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0411, loss_cls: 0.1738, acc: 93.4590, loss_bbox: 0.2299, loss_mask: 0.2258, loss: 0.6855 2024-05-31 10:23:54,317 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 8:04:37, time: 0.864, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0418, loss_cls: 0.1725, acc: 93.6526, loss_bbox: 0.2241, loss_mask: 0.2252, loss: 0.6806 2024-05-31 10:24:34,857 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 8:03:55, time: 0.811, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0416, loss_cls: 0.1617, acc: 93.9407, loss_bbox: 0.2147, loss_mask: 0.2265, loss: 0.6600 2024-05-31 10:25:15,662 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 8:03:14, time: 0.816, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0420, loss_cls: 0.1743, acc: 93.5483, loss_bbox: 0.2255, loss_mask: 0.2250, loss: 0.6826 2024-05-31 10:25:56,532 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 8:02:32, time: 0.817, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0440, loss_cls: 0.1723, acc: 93.5942, loss_bbox: 0.2263, loss_mask: 0.2327, loss: 0.6910 2024-05-31 10:26:37,123 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 8:01:50, time: 0.812, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0406, loss_cls: 0.1725, acc: 93.5107, loss_bbox: 0.2266, loss_mask: 0.2320, loss: 0.6874 2024-05-31 10:27:18,077 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 8:01:09, time: 0.819, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0395, loss_cls: 0.1707, acc: 93.5342, loss_bbox: 0.2212, loss_mask: 0.2240, loss: 0.6695 2024-05-31 10:27:59,172 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 8:00:27, time: 0.822, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0407, loss_cls: 0.1750, acc: 
93.3625, loss_bbox: 0.2318, loss_mask: 0.2261, loss: 0.6893 2024-05-31 10:28:40,367 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 7:59:46, time: 0.824, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0450, loss_cls: 0.1767, acc: 93.4395, loss_bbox: 0.2249, loss_mask: 0.2331, loss: 0.6958 2024-05-31 10:29:23,960 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 7:59:06, time: 0.872, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0472, loss_cls: 0.1843, acc: 93.1775, loss_bbox: 0.2384, loss_mask: 0.2374, loss: 0.7253 2024-05-31 10:30:04,312 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 7:58:24, time: 0.807, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0398, loss_cls: 0.1718, acc: 93.5986, loss_bbox: 0.2196, loss_mask: 0.2254, loss: 0.6711 2024-05-31 10:30:44,772 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 7:57:42, time: 0.809, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0406, loss_cls: 0.1693, acc: 93.7090, loss_bbox: 0.2216, loss_mask: 0.2302, loss: 0.6761 2024-05-31 10:31:28,337 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 7:57:02, time: 0.871, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0426, loss_cls: 0.1761, acc: 93.3975, loss_bbox: 0.2254, loss_mask: 0.2279, loss: 0.6888 2024-05-31 10:32:08,610 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 7:56:21, time: 0.806, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0405, loss_cls: 0.1675, acc: 93.8518, loss_bbox: 0.2160, loss_mask: 0.2246, loss: 0.6632 2024-05-31 10:32:49,223 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 7:55:39, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0411, loss_cls: 0.1707, acc: 93.6443, loss_bbox: 0.2240, loss_mask: 0.2288, loss: 0.6811 2024-05-31 10:33:29,280 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 7:54:57, time: 0.801, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0416, loss_cls: 0.1757, acc: 93.4116, loss_bbox: 0.2255, loss_mask: 0.2266, loss: 0.6852 2024-05-31 10:34:12,177 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 7:54:16, time: 0.858, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0383, loss_cls: 0.1663, acc: 93.8359, loss_bbox: 0.2190, loss_mask: 0.2264, loss: 0.6637 2024-05-31 10:34:56,550 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 7:53:37, time: 0.887, data_time: 0.076, memory: 18874, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0417, loss_cls: 0.1704, acc: 93.5144, loss_bbox: 0.2250, loss_mask: 0.2208, loss: 0.6721 2024-05-31 10:35:43,862 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 7:53:00, time: 0.947, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0413, loss_cls: 0.1612, acc: 93.9331, loss_bbox: 0.2177, loss_mask: 0.2231, loss: 0.6583 2024-05-31 10:36:24,459 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 7:52:18, time: 0.812, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0400, loss_cls: 0.1663, acc: 93.7253, loss_bbox: 0.2178, loss_mask: 0.2245, loss: 0.6643 2024-05-31 10:37:06,562 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 7:51:37, time: 0.842, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0423, loss_cls: 0.1736, acc: 93.4746, loss_bbox: 0.2266, loss_mask: 0.2284, loss: 0.6857 2024-05-31 10:37:47,594 - mmdet - INFO - Epoch 
[8][2450/7330] lr: 1.000e-04, eta: 7:50:56, time: 0.820, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0421, loss_cls: 0.1680, acc: 93.6965, loss_bbox: 0.2191, loss_mask: 0.2227, loss: 0.6675 2024-05-31 10:38:30,998 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 7:50:16, time: 0.868, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0419, loss_cls: 0.1680, acc: 93.7156, loss_bbox: 0.2186, loss_mask: 0.2244, loss: 0.6698 2024-05-31 10:39:11,531 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 7:49:34, time: 0.811, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0416, loss_cls: 0.1708, acc: 93.6472, loss_bbox: 0.2237, loss_mask: 0.2259, loss: 0.6788 2024-05-31 10:39:52,269 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 7:48:52, time: 0.815, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0415, loss_cls: 0.1720, acc: 93.5417, loss_bbox: 0.2290, loss_mask: 0.2346, loss: 0.6943 2024-05-31 10:40:32,817 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 7:48:10, time: 0.811, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0411, loss_cls: 0.1767, acc: 93.4287, loss_bbox: 0.2304, loss_mask: 0.2312, loss: 0.6941 2024-05-31 10:41:13,797 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 7:47:29, time: 0.820, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0412, loss_cls: 0.1695, acc: 93.6980, loss_bbox: 0.2176, loss_mask: 0.2223, loss: 0.6667 2024-05-31 10:41:54,419 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 7:46:47, time: 0.812, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0407, loss_cls: 0.1671, acc: 93.7231, loss_bbox: 0.2172, loss_mask: 0.2254, loss: 0.6648 2024-05-31 10:42:34,679 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 7:46:05, time: 0.805, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0419, loss_cls: 0.1687, acc: 93.6709, loss_bbox: 0.2225, loss_mask: 0.2264, loss: 0.6746 2024-05-31 10:43:15,200 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 7:45:23, time: 0.811, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0417, loss_cls: 0.1676, acc: 93.7283, loss_bbox: 0.2201, loss_mask: 0.2279, loss: 0.6714 2024-05-31 10:43:58,117 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 7:44:43, time: 0.858, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0414, loss_cls: 0.1807, acc: 93.2305, loss_bbox: 0.2328, loss_mask: 0.2307, loss: 0.7020 2024-05-31 10:44:38,924 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 7:44:01, time: 0.816, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0405, loss_cls: 0.1736, acc: 93.5864, loss_bbox: 0.2246, loss_mask: 0.2273, loss: 0.6828 2024-05-31 10:45:19,375 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 7:43:20, time: 0.809, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0405, loss_cls: 0.1669, acc: 93.8147, loss_bbox: 0.2190, loss_mask: 0.2264, loss: 0.6680 2024-05-31 10:46:03,150 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 7:42:40, time: 0.876, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0423, loss_cls: 0.1738, acc: 93.4917, loss_bbox: 0.2252, loss_mask: 0.2287, loss: 0.6860 2024-05-31 10:46:43,450 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 7:41:58, time: 0.806, data_time: 0.065, memory: 18874, loss_rpn_cls: 
0.0166, loss_rpn_bbox: 0.0398, loss_cls: 0.1666, acc: 93.8123, loss_bbox: 0.2135, loss_mask: 0.2261, loss: 0.6626 2024-05-31 10:47:24,371 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 7:41:16, time: 0.818, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0421, loss_cls: 0.1734, acc: 93.5112, loss_bbox: 0.2270, loss_mask: 0.2273, loss: 0.6856 2024-05-31 10:48:05,584 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 7:40:35, time: 0.824, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0428, loss_cls: 0.1718, acc: 93.6335, loss_bbox: 0.2200, loss_mask: 0.2288, loss: 0.6806 2024-05-31 10:48:48,105 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 7:39:54, time: 0.850, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0417, loss_cls: 0.1725, acc: 93.6360, loss_bbox: 0.2246, loss_mask: 0.2276, loss: 0.6821 2024-05-31 10:49:31,471 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 7:39:14, time: 0.867, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0417, loss_cls: 0.1707, acc: 93.6074, loss_bbox: 0.2227, loss_mask: 0.2247, loss: 0.6769 2024-05-31 10:50:14,283 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 7:38:34, time: 0.856, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0436, loss_cls: 0.1785, acc: 93.3784, loss_bbox: 0.2296, loss_mask: 0.2310, loss: 0.6996 2024-05-31 10:50:56,760 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 7:37:53, time: 0.850, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0439, loss_cls: 0.1761, acc: 93.4399, loss_bbox: 0.2265, loss_mask: 0.2325, loss: 0.6958 2024-05-31 10:51:39,681 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 7:37:13, time: 0.858, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0420, loss_cls: 0.1702, acc: 93.7698, loss_bbox: 0.2142, loss_mask: 0.2233, loss: 0.6641 2024-05-31 10:52:20,417 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 7:36:31, time: 0.815, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0413, loss_cls: 0.1782, acc: 93.2573, loss_bbox: 0.2261, loss_mask: 0.2259, loss: 0.6879 2024-05-31 10:53:03,837 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 7:35:51, time: 0.869, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0420, loss_cls: 0.1818, acc: 93.2971, loss_bbox: 0.2341, loss_mask: 0.2326, loss: 0.7065 2024-05-31 10:53:45,104 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 7:35:10, time: 0.825, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0421, loss_cls: 0.1741, acc: 93.4729, loss_bbox: 0.2216, loss_mask: 0.2275, loss: 0.6819 2024-05-31 10:54:26,337 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 7:34:28, time: 0.825, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0445, loss_cls: 0.1805, acc: 93.2793, loss_bbox: 0.2306, loss_mask: 0.2267, loss: 0.6986 2024-05-31 10:55:06,922 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 7:33:47, time: 0.812, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0415, loss_cls: 0.1777, acc: 93.4048, loss_bbox: 0.2228, loss_mask: 0.2295, loss: 0.6883 2024-05-31 10:55:47,322 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 7:33:05, time: 0.808, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0417, loss_cls: 0.1771, acc: 93.4980, loss_bbox: 0.2280, loss_mask: 0.2320, loss: 
0.6949 2024-05-31 10:56:27,814 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 7:32:23, time: 0.810, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0399, loss_cls: 0.1692, acc: 93.6863, loss_bbox: 0.2200, loss_mask: 0.2258, loss: 0.6714 2024-05-31 10:57:07,871 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 7:31:41, time: 0.801, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0415, loss_cls: 0.1652, acc: 93.8262, loss_bbox: 0.2167, loss_mask: 0.2238, loss: 0.6622 2024-05-31 10:57:48,733 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 7:30:59, time: 0.817, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0442, loss_cls: 0.1766, acc: 93.2751, loss_bbox: 0.2329, loss_mask: 0.2306, loss: 0.7022 2024-05-31 10:58:31,340 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 7:30:19, time: 0.852, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0405, loss_cls: 0.1657, acc: 93.8406, loss_bbox: 0.2175, loss_mask: 0.2236, loss: 0.6640 2024-05-31 10:59:12,186 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 7:29:37, time: 0.817, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0463, loss_cls: 0.1806, acc: 93.2422, loss_bbox: 0.2366, loss_mask: 0.2320, loss: 0.7118 2024-05-31 10:59:52,683 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 7:28:55, time: 0.810, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0402, loss_cls: 0.1677, acc: 93.7148, loss_bbox: 0.2164, loss_mask: 0.2236, loss: 0.6640 2024-05-31 11:00:36,559 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 7:28:16, time: 0.877, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0420, loss_cls: 0.1720, acc: 93.5049, loss_bbox: 0.2244, loss_mask: 0.2302, loss: 0.6846 2024-05-31 11:01:17,394 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 7:27:34, time: 0.817, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0419, loss_cls: 0.1774, acc: 93.4224, loss_bbox: 0.2291, loss_mask: 0.2291, loss: 0.6929 2024-05-31 11:01:58,407 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 7:26:53, time: 0.820, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0438, loss_cls: 0.1731, acc: 93.5588, loss_bbox: 0.2233, loss_mask: 0.2281, loss: 0.6845 2024-05-31 11:02:39,023 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 7:26:11, time: 0.812, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0409, loss_cls: 0.1712, acc: 93.5872, loss_bbox: 0.2181, loss_mask: 0.2263, loss: 0.6714 2024-05-31 11:03:22,738 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 7:25:31, time: 0.874, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0434, loss_cls: 0.1777, acc: 93.2998, loss_bbox: 0.2262, loss_mask: 0.2306, loss: 0.6940 2024-05-31 11:04:06,486 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 7:24:51, time: 0.875, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0413, loss_cls: 0.1732, acc: 93.5415, loss_bbox: 0.2234, loss_mask: 0.2263, loss: 0.6801 2024-05-31 11:04:49,694 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 7:24:11, time: 0.864, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0418, loss_cls: 0.1776, acc: 93.2876, loss_bbox: 0.2291, loss_mask: 0.2315, loss: 0.6961 2024-05-31 11:05:31,960 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 7:23:30, time: 
0.845, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0399, loss_cls: 0.1638, acc: 94.0427, loss_bbox: 0.2124, loss_mask: 0.2262, loss: 0.6574 2024-05-31 11:06:15,566 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 7:22:50, time: 0.872, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0415, loss_cls: 0.1755, acc: 93.5115, loss_bbox: 0.2269, loss_mask: 0.2262, loss: 0.6862 2024-05-31 11:06:56,364 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 7:22:08, time: 0.816, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0413, loss_cls: 0.1729, acc: 93.5359, loss_bbox: 0.2222, loss_mask: 0.2261, loss: 0.6792 2024-05-31 11:07:39,976 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 7:21:28, time: 0.872, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0418, loss_cls: 0.1728, acc: 93.5037, loss_bbox: 0.2221, loss_mask: 0.2255, loss: 0.6772 2024-05-31 11:08:20,700 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 7:20:47, time: 0.814, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0406, loss_cls: 0.1722, acc: 93.6379, loss_bbox: 0.2193, loss_mask: 0.2238, loss: 0.6708 2024-05-31 11:09:01,320 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 7:20:05, time: 0.812, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0407, loss_cls: 0.1720, acc: 93.6418, loss_bbox: 0.2192, loss_mask: 0.2244, loss: 0.6709 2024-05-31 11:09:42,017 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 7:19:23, time: 0.814, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0401, loss_cls: 0.1725, acc: 93.5425, loss_bbox: 0.2187, loss_mask: 0.2313, loss: 0.6783 2024-05-31 11:10:22,701 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 7:18:42, time: 0.814, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0409, loss_cls: 0.1777, acc: 93.4348, loss_bbox: 0.2205, loss_mask: 0.2311, loss: 0.6864 2024-05-31 11:11:03,295 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 7:18:00, time: 0.812, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0443, loss_cls: 0.1804, acc: 93.3660, loss_bbox: 0.2320, loss_mask: 0.2339, loss: 0.7063 2024-05-31 11:11:44,168 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 7:17:18, time: 0.817, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0429, loss_cls: 0.1750, acc: 93.4592, loss_bbox: 0.2272, loss_mask: 0.2345, loss: 0.6962 2024-05-31 11:12:24,389 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 7:16:36, time: 0.804, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0423, loss_cls: 0.1715, acc: 93.5857, loss_bbox: 0.2206, loss_mask: 0.2297, loss: 0.6802 2024-05-31 11:13:07,152 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 7:15:56, time: 0.855, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0418, loss_cls: 0.1725, acc: 93.5830, loss_bbox: 0.2202, loss_mask: 0.2264, loss: 0.6774 2024-05-31 11:13:47,603 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 7:15:14, time: 0.809, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0430, loss_cls: 0.1763, acc: 93.5085, loss_bbox: 0.2286, loss_mask: 0.2248, loss: 0.6885 2024-05-31 11:14:28,086 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 7:14:32, time: 0.810, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0412, loss_cls: 0.1733, acc: 
93.4590, loss_bbox: 0.2242, loss_mask: 0.2276, loss: 0.6825 2024-05-31 11:15:11,678 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 7:13:52, time: 0.872, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0413, loss_cls: 0.1785, acc: 93.4248, loss_bbox: 0.2222, loss_mask: 0.2332, loss: 0.6906 2024-05-31 11:15:51,852 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 7:13:10, time: 0.803, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0396, loss_cls: 0.1738, acc: 93.6060, loss_bbox: 0.2216, loss_mask: 0.2287, loss: 0.6784 2024-05-31 11:16:32,834 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 7:12:29, time: 0.820, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0435, loss_cls: 0.1729, acc: 93.5081, loss_bbox: 0.2275, loss_mask: 0.2324, loss: 0.6922 2024-05-31 11:17:13,202 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 7:11:47, time: 0.807, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0410, loss_cls: 0.1693, acc: 93.7551, loss_bbox: 0.2192, loss_mask: 0.2246, loss: 0.6693 2024-05-31 11:17:55,693 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 7:11:06, time: 0.850, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0410, loss_cls: 0.1718, acc: 93.5203, loss_bbox: 0.2238, loss_mask: 0.2296, loss: 0.6806 2024-05-31 11:18:38,947 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 7:10:26, time: 0.865, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0448, loss_cls: 0.1756, acc: 93.5493, loss_bbox: 0.2271, loss_mask: 0.2216, loss: 0.6864 2024-05-31 11:19:19,628 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 7:09:44, time: 0.814, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0435, loss_cls: 0.1802, acc: 93.2065, loss_bbox: 0.2292, loss_mask: 0.2255, loss: 0.6949 2024-05-31 11:20:06,671 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 7:09:06, time: 0.941, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0429, loss_cls: 0.1727, acc: 93.4966, loss_bbox: 0.2313, loss_mask: 0.2306, loss: 0.6940 2024-05-31 11:20:49,987 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 7:08:26, time: 0.866, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0442, loss_cls: 0.1811, acc: 93.1699, loss_bbox: 0.2318, loss_mask: 0.2310, loss: 0.7045 2024-05-31 11:21:31,015 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 7:07:44, time: 0.821, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0438, loss_cls: 0.1717, acc: 93.6987, loss_bbox: 0.2204, loss_mask: 0.2268, loss: 0.6798 2024-05-31 11:22:14,580 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 7:07:04, time: 0.871, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0426, loss_cls: 0.1758, acc: 93.4404, loss_bbox: 0.2297, loss_mask: 0.2317, loss: 0.6961 2024-05-31 11:22:55,009 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 7:06:22, time: 0.809, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0422, loss_cls: 0.1678, acc: 93.7742, loss_bbox: 0.2173, loss_mask: 0.2272, loss: 0.6697 2024-05-31 11:23:35,894 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 7:05:41, time: 0.818, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0397, loss_cls: 0.1745, acc: 93.5625, loss_bbox: 0.2208, loss_mask: 0.2244, loss: 0.6754 2024-05-31 11:24:16,553 - mmdet - INFO - Epoch 
[8][5800/7330] lr: 1.000e-04, eta: 7:04:59, time: 0.813, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0422, loss_cls: 0.1765, acc: 93.4829, loss_bbox: 0.2231, loss_mask: 0.2250, loss: 0.6829 2024-05-31 11:24:56,880 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 7:04:17, time: 0.806, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0392, loss_cls: 0.1601, acc: 93.9631, loss_bbox: 0.2124, loss_mask: 0.2225, loss: 0.6490 2024-05-31 11:25:37,492 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 7:03:36, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0429, loss_cls: 0.1775, acc: 93.4539, loss_bbox: 0.2275, loss_mask: 0.2275, loss: 0.6915 2024-05-31 11:26:18,151 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 7:02:54, time: 0.813, data_time: 0.041, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0419, loss_cls: 0.1763, acc: 93.4231, loss_bbox: 0.2262, loss_mask: 0.2294, loss: 0.6896 2024-05-31 11:26:59,041 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 7:02:12, time: 0.818, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0447, loss_cls: 0.1836, acc: 93.2410, loss_bbox: 0.2311, loss_mask: 0.2336, loss: 0.7095 2024-05-31 11:27:40,107 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 7:01:31, time: 0.821, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0461, loss_cls: 0.1793, acc: 93.3210, loss_bbox: 0.2307, loss_mask: 0.2312, loss: 0.7044 2024-05-31 11:28:22,788 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 7:00:50, time: 0.854, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0414, loss_cls: 0.1684, acc: 93.6904, loss_bbox: 0.2186, loss_mask: 0.2262, loss: 0.6712 2024-05-31 11:29:02,973 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 7:00:08, time: 0.804, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0397, loss_cls: 0.1707, acc: 93.5813, loss_bbox: 0.2213, loss_mask: 0.2302, loss: 0.6768 2024-05-31 11:29:45,343 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 6:59:28, time: 0.847, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0405, loss_cls: 0.1718, acc: 93.4990, loss_bbox: 0.2206, loss_mask: 0.2205, loss: 0.6684 2024-05-31 11:30:26,081 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 6:58:46, time: 0.815, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0427, loss_cls: 0.1743, acc: 93.5400, loss_bbox: 0.2246, loss_mask: 0.2305, loss: 0.6885 2024-05-31 11:31:06,891 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 6:58:04, time: 0.816, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0422, loss_cls: 0.1754, acc: 93.4919, loss_bbox: 0.2257, loss_mask: 0.2337, loss: 0.6935 2024-05-31 11:31:47,000 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 6:57:22, time: 0.802, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0415, loss_cls: 0.1756, acc: 93.5857, loss_bbox: 0.2257, loss_mask: 0.2246, loss: 0.6841 2024-05-31 11:32:30,292 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 6:56:42, time: 0.866, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0420, loss_cls: 0.1712, acc: 93.6819, loss_bbox: 0.2217, loss_mask: 0.2290, loss: 0.6804 2024-05-31 11:33:14,792 - mmdet - INFO - Epoch [8][6450/7330] lr: 1.000e-04, eta: 6:56:02, time: 0.890, data_time: 0.068, memory: 18874, loss_rpn_cls: 
0.0146, loss_rpn_bbox: 0.0425, loss_cls: 0.1717, acc: 93.6919, loss_bbox: 0.2200, loss_mask: 0.2270, loss: 0.6758 2024-05-31 11:33:55,364 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 6:55:21, time: 0.811, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0417, loss_cls: 0.1727, acc: 93.4373, loss_bbox: 0.2243, loss_mask: 0.2339, loss: 0.6884 2024-05-31 11:34:40,744 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 6:54:42, time: 0.908, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0458, loss_cls: 0.1819, acc: 93.1477, loss_bbox: 0.2358, loss_mask: 0.2349, loss: 0.7155 2024-05-31 11:35:23,567 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 6:54:01, time: 0.856, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0438, loss_cls: 0.1780, acc: 93.3997, loss_bbox: 0.2283, loss_mask: 0.2354, loss: 0.7018 2024-05-31 11:36:04,169 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 6:53:19, time: 0.812, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0416, loss_cls: 0.1752, acc: 93.4978, loss_bbox: 0.2248, loss_mask: 0.2314, loss: 0.6895 2024-05-31 11:36:47,416 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 6:52:39, time: 0.865, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0418, loss_cls: 0.1673, acc: 93.7778, loss_bbox: 0.2187, loss_mask: 0.2245, loss: 0.6665 2024-05-31 11:37:28,416 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 6:51:57, time: 0.820, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0434, loss_cls: 0.1778, acc: 93.3601, loss_bbox: 0.2326, loss_mask: 0.2283, loss: 0.6979 2024-05-31 11:38:08,777 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 6:51:16, time: 0.807, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0414, loss_cls: 0.1713, acc: 93.6262, loss_bbox: 0.2226, loss_mask: 0.2295, loss: 0.6807 2024-05-31 11:38:49,278 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 6:50:34, time: 0.810, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0437, loss_cls: 0.1707, acc: 93.6096, loss_bbox: 0.2258, loss_mask: 0.2329, loss: 0.6895 2024-05-31 11:39:30,063 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 6:49:52, time: 0.816, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0416, loss_cls: 0.1758, acc: 93.3704, loss_bbox: 0.2268, loss_mask: 0.2254, loss: 0.6850 2024-05-31 11:40:10,515 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 6:49:10, time: 0.809, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0401, loss_cls: 0.1712, acc: 93.6423, loss_bbox: 0.2185, loss_mask: 0.2258, loss: 0.6716 2024-05-31 11:40:51,900 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 6:48:29, time: 0.828, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0417, loss_cls: 0.1774, acc: 93.3503, loss_bbox: 0.2263, loss_mask: 0.2246, loss: 0.6869 2024-05-31 11:41:32,747 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 6:47:48, time: 0.817, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0420, loss_cls: 0.1742, acc: 93.5271, loss_bbox: 0.2223, loss_mask: 0.2260, loss: 0.6811 2024-05-31 11:42:13,340 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 6:47:06, time: 0.812, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0437, loss_cls: 0.1730, acc: 93.5774, loss_bbox: 0.2240, loss_mask: 0.2278, loss: 
0.6854
2024-05-31 11:42:56,585 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 6:46:25, time: 0.865, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0427, loss_cls: 0.1744, acc: 93.4019, loss_bbox: 0.2282, loss_mask: 0.2329, loss: 0.6947
2024-05-31 11:43:36,650 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 6:45:43, time: 0.801, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0411, loss_cls: 0.1731, acc: 93.5823, loss_bbox: 0.2222, loss_mask: 0.2242, loss: 0.6758
2024-05-31 11:44:19,749 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 6:45:03, time: 0.862, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0399, loss_cls: 0.1746, acc: 93.5559, loss_bbox: 0.2250, loss_mask: 0.2314, loss: 0.6869
2024-05-31 11:45:00,806 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 6:44:22, time: 0.821, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0420, loss_cls: 0.1742, acc: 93.5676, loss_bbox: 0.2240, loss_mask: 0.2231, loss: 0.6802
2024-05-31 11:45:26,147 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-05-31 11:47:10,897 - mmdet - INFO - Evaluating bbox...
2024-05-31 11:47:33,091 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.472
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.705
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.522
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.292
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.517
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.635
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.588
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.588
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.588
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.399
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.630
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.753
2024-05-31 11:47:33,091 - mmdet - INFO - Evaluating segm...
2024-05-31 11:47:55,390 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.671
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.454
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.211
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.461
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.637
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.530
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.530
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.530
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.328
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.577
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.718
2024-05-31 11:47:55,765 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 11:47:55,766 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4720, bbox_mAP_50: 0.7050, bbox_mAP_75: 0.5220, bbox_mAP_s: 0.2920, bbox_mAP_m: 0.5170, bbox_mAP_l: 0.6350, bbox_mAP_copypaste: 0.472 0.705 0.522 0.292 0.517 0.635, segm_mAP: 0.4240, segm_mAP_50: 0.6710, segm_mAP_75: 0.4540, segm_mAP_s: 0.2110, segm_mAP_m: 0.4610, segm_mAP_l: 0.6370, segm_mAP_copypaste: 0.424 0.671 0.454 0.211 0.461 0.637
2024-05-31 11:48:50,071 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 6:43:10, time: 1.086, data_time: 0.136, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0413, loss_cls: 0.1700, acc: 93.5073, loss_bbox: 0.2252, loss_mask: 0.2232, loss: 0.6743
2024-05-31 11:49:35,072 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 6:42:30, time: 0.900, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0403, loss_cls: 0.1596, acc: 93.9717, loss_bbox: 0.2086, loss_mask: 0.2164, loss: 0.6389
2024-05-31 11:50:16,196 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 6:41:49, time: 0.822, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0408, loss_cls: 0.1651, acc: 93.7864, loss_bbox: 0.2146, loss_mask: 0.2202, loss: 0.6542
2024-05-31 11:50:56,909 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 6:41:07, time: 0.814, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0414, loss_cls: 0.1636, acc: 93.6941, loss_bbox: 0.2157, loss_mask: 0.2247, loss: 0.6599
2024-05-31 11:51:39,767 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 6:40:26, time: 0.857, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0363, loss_cls: 0.1535, acc: 94.1108, loss_bbox: 0.2008, loss_mask: 0.2177, loss: 0.6210
2024-05-31 11:52:20,126 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 6:39:45, time: 0.808, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0389, loss_cls: 0.1604, acc: 94.0166, loss_bbox: 0.2105, loss_mask: 0.2164, loss: 0.6404
2024-05-31 11:53:00,939 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 6:39:03, time: 0.816, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0393, loss_cls: 0.1609, acc: 93.9038, loss_bbox: 0.2101, loss_mask: 0.2191, loss: 0.6427
2024-05-31 11:53:41,147 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 6:38:21, time: 0.804, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0374, loss_cls: 0.1551, acc: 94.1604, loss_bbox: 0.2042, loss_mask:
0.2178, loss: 0.6282 2024-05-31 11:54:21,887 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 6:37:40, time: 0.815, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0388, loss_cls: 0.1620, acc: 93.8843, loss_bbox: 0.2120, loss_mask: 0.2185, loss: 0.6437 2024-05-31 11:55:07,906 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 6:37:01, time: 0.920, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0397, loss_cls: 0.1660, acc: 93.7449, loss_bbox: 0.2216, loss_mask: 0.2250, loss: 0.6661 2024-05-31 11:55:48,954 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 6:36:19, time: 0.821, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0373, loss_cls: 0.1538, acc: 94.1421, loss_bbox: 0.2022, loss_mask: 0.2184, loss: 0.6249 2024-05-31 11:56:29,814 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 6:35:37, time: 0.818, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0380, loss_cls: 0.1527, acc: 94.1909, loss_bbox: 0.2063, loss_mask: 0.2215, loss: 0.6308 2024-05-31 11:57:13,465 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 6:34:57, time: 0.873, data_time: 0.075, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0410, loss_cls: 0.1615, acc: 93.7800, loss_bbox: 0.2161, loss_mask: 0.2229, loss: 0.6547 2024-05-31 11:57:54,230 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 6:34:16, time: 0.815, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0381, loss_cls: 0.1567, acc: 94.0679, loss_bbox: 0.2074, loss_mask: 0.2149, loss: 0.6300 2024-05-31 11:58:34,862 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 6:33:34, time: 0.813, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0379, loss_cls: 0.1522, acc: 94.1289, loss_bbox: 0.2087, loss_mask: 0.2151, loss: 0.6255 2024-05-31 11:59:16,113 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 6:32:53, time: 0.825, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0397, loss_cls: 0.1592, acc: 94.0037, loss_bbox: 0.2097, loss_mask: 0.2185, loss: 0.6414 2024-05-31 11:59:57,315 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 6:32:11, time: 0.824, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0394, loss_cls: 0.1604, acc: 93.9841, loss_bbox: 0.2097, loss_mask: 0.2133, loss: 0.6361 2024-05-31 12:00:40,897 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 6:31:31, time: 0.872, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0400, loss_cls: 0.1703, acc: 93.5378, loss_bbox: 0.2227, loss_mask: 0.2232, loss: 0.6704 2024-05-31 12:01:22,101 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 6:30:50, time: 0.824, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0396, loss_cls: 0.1605, acc: 93.8838, loss_bbox: 0.2162, loss_mask: 0.2196, loss: 0.6487 2024-05-31 12:02:02,460 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 6:30:08, time: 0.807, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0375, loss_cls: 0.1520, acc: 94.2280, loss_bbox: 0.2039, loss_mask: 0.2149, loss: 0.6222 2024-05-31 12:02:43,288 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 6:29:26, time: 0.817, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0388, loss_cls: 0.1537, acc: 94.1621, loss_bbox: 0.2087, loss_mask: 0.2150, loss: 0.6292 2024-05-31 12:03:23,671 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 6:28:45, time: 
0.808, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0371, loss_cls: 0.1519, acc: 94.2219, loss_bbox: 0.2034, loss_mask: 0.2165, loss: 0.6212 2024-05-31 12:04:10,030 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 6:28:06, time: 0.927, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0399, loss_cls: 0.1689, acc: 93.5454, loss_bbox: 0.2214, loss_mask: 0.2240, loss: 0.6682 2024-05-31 12:04:51,627 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 6:27:24, time: 0.832, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0406, loss_cls: 0.1603, acc: 93.8215, loss_bbox: 0.2172, loss_mask: 0.2203, loss: 0.6520 2024-05-31 12:05:32,262 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 6:26:43, time: 0.813, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0376, loss_cls: 0.1542, acc: 94.1321, loss_bbox: 0.2059, loss_mask: 0.2160, loss: 0.6265 2024-05-31 12:06:15,372 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 6:26:02, time: 0.862, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0370, loss_cls: 0.1531, acc: 94.0686, loss_bbox: 0.2059, loss_mask: 0.2141, loss: 0.6220 2024-05-31 12:06:59,232 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 6:25:22, time: 0.877, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0383, loss_cls: 0.1592, acc: 93.8845, loss_bbox: 0.2078, loss_mask: 0.2101, loss: 0.6282 2024-05-31 12:07:40,202 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 6:24:41, time: 0.819, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0393, loss_cls: 0.1591, acc: 93.9558, loss_bbox: 0.2143, loss_mask: 0.2202, loss: 0.6473 2024-05-31 12:08:20,967 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 6:23:59, time: 0.815, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0383, loss_cls: 0.1553, acc: 93.9775, loss_bbox: 0.2053, loss_mask: 0.2132, loss: 0.6244 2024-05-31 12:09:04,731 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 6:23:19, time: 0.875, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0373, loss_cls: 0.1527, acc: 94.2688, loss_bbox: 0.2006, loss_mask: 0.2114, loss: 0.6155 2024-05-31 12:09:47,927 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 6:22:38, time: 0.864, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0390, loss_cls: 0.1593, acc: 93.8799, loss_bbox: 0.2130, loss_mask: 0.2223, loss: 0.6464 2024-05-31 12:10:28,370 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 6:21:56, time: 0.809, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0359, loss_cls: 0.1474, acc: 94.4524, loss_bbox: 0.1936, loss_mask: 0.2094, loss: 0.5983 2024-05-31 12:11:09,667 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 6:21:15, time: 0.826, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0405, loss_cls: 0.1564, acc: 94.0137, loss_bbox: 0.2093, loss_mask: 0.2144, loss: 0.6334 2024-05-31 12:11:52,445 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 6:20:34, time: 0.856, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0373, loss_cls: 0.1543, acc: 94.1123, loss_bbox: 0.2063, loss_mask: 0.2134, loss: 0.6241 2024-05-31 12:12:34,113 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 6:19:53, time: 0.833, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0371, loss_cls: 0.1572, acc: 
94.0183, loss_bbox: 0.2077, loss_mask: 0.2154, loss: 0.6306 2024-05-31 12:13:15,702 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 6:19:12, time: 0.832, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0369, loss_cls: 0.1547, acc: 94.1653, loss_bbox: 0.2028, loss_mask: 0.2185, loss: 0.6260 2024-05-31 12:13:56,290 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 6:18:30, time: 0.812, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0388, loss_cls: 0.1625, acc: 93.8794, loss_bbox: 0.2157, loss_mask: 0.2170, loss: 0.6471 2024-05-31 12:14:37,139 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 6:17:49, time: 0.817, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0379, loss_cls: 0.1577, acc: 93.9543, loss_bbox: 0.2072, loss_mask: 0.2138, loss: 0.6302 2024-05-31 12:15:20,741 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 6:17:08, time: 0.872, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0396, loss_cls: 0.1599, acc: 93.8684, loss_bbox: 0.2146, loss_mask: 0.2223, loss: 0.6494 2024-05-31 12:16:01,423 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 6:16:27, time: 0.814, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0397, loss_cls: 0.1522, acc: 94.0962, loss_bbox: 0.2045, loss_mask: 0.2152, loss: 0.6239 2024-05-31 12:16:42,040 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 6:15:45, time: 0.812, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0374, loss_cls: 0.1569, acc: 94.0532, loss_bbox: 0.2111, loss_mask: 0.2192, loss: 0.6378 2024-05-31 12:17:22,941 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 6:15:04, time: 0.818, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0395, loss_cls: 0.1547, acc: 94.1941, loss_bbox: 0.2093, loss_mask: 0.2149, loss: 0.6333 2024-05-31 12:18:07,329 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 6:14:24, time: 0.888, data_time: 0.073, memory: 18874, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0407, loss_cls: 0.1586, acc: 93.9045, loss_bbox: 0.2126, loss_mask: 0.2217, loss: 0.6474 2024-05-31 12:18:51,404 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 6:13:44, time: 0.882, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0399, loss_cls: 0.1572, acc: 94.0015, loss_bbox: 0.2057, loss_mask: 0.2198, loss: 0.6350 2024-05-31 12:19:31,888 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 6:13:02, time: 0.809, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0386, loss_cls: 0.1598, acc: 93.9475, loss_bbox: 0.2084, loss_mask: 0.2190, loss: 0.6393 2024-05-31 12:20:12,449 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 6:12:20, time: 0.812, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0383, loss_cls: 0.1542, acc: 94.1104, loss_bbox: 0.2030, loss_mask: 0.2169, loss: 0.6270 2024-05-31 12:20:56,500 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 6:11:40, time: 0.881, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0374, loss_cls: 0.1471, acc: 94.3721, loss_bbox: 0.1959, loss_mask: 0.2101, loss: 0.6025 2024-05-31 12:21:40,974 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 6:11:00, time: 0.890, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0386, loss_cls: 0.1610, acc: 93.8914, loss_bbox: 0.2110, loss_mask: 0.2133, loss: 0.6375 2024-05-31 12:22:21,944 - mmdet - INFO - Epoch 
[9][2450/7330] lr: 1.000e-05, eta: 6:10:19, time: 0.819, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0381, loss_cls: 0.1581, acc: 94.0549, loss_bbox: 0.2085, loss_mask: 0.2171, loss: 0.6360 2024-05-31 12:23:02,804 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 6:09:37, time: 0.817, data_time: 0.072, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0384, loss_cls: 0.1527, acc: 94.3108, loss_bbox: 0.2069, loss_mask: 0.2157, loss: 0.6260 2024-05-31 12:23:46,893 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 6:08:57, time: 0.882, data_time: 0.082, memory: 18874, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0398, loss_cls: 0.1572, acc: 93.9412, loss_bbox: 0.2074, loss_mask: 0.2161, loss: 0.6339 2024-05-31 12:24:30,113 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 6:08:16, time: 0.864, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0389, loss_cls: 0.1555, acc: 93.9958, loss_bbox: 0.2114, loss_mask: 0.2170, loss: 0.6362 2024-05-31 12:25:11,556 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 6:07:35, time: 0.829, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0367, loss_cls: 0.1447, acc: 94.4893, loss_bbox: 0.1988, loss_mask: 0.2107, loss: 0.6024 2024-05-31 12:25:52,205 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 6:06:53, time: 0.813, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0405, loss_cls: 0.1538, acc: 94.1079, loss_bbox: 0.2109, loss_mask: 0.2205, loss: 0.6388 2024-05-31 12:26:36,890 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 6:06:13, time: 0.894, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0401, loss_cls: 0.1585, acc: 94.0608, loss_bbox: 0.2041, loss_mask: 0.2139, loss: 0.6302 2024-05-31 12:27:17,325 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 6:05:32, time: 0.809, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0361, loss_cls: 0.1529, acc: 94.1768, loss_bbox: 0.2032, loss_mask: 0.2153, loss: 0.6206 2024-05-31 12:27:57,757 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 6:04:50, time: 0.809, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0382, loss_cls: 0.1530, acc: 94.1946, loss_bbox: 0.2081, loss_mask: 0.2152, loss: 0.6272 2024-05-31 12:28:38,778 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 6:04:08, time: 0.820, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0390, loss_cls: 0.1518, acc: 94.1421, loss_bbox: 0.2087, loss_mask: 0.2179, loss: 0.6306 2024-05-31 12:29:20,497 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 6:03:27, time: 0.834, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0373, loss_cls: 0.1593, acc: 93.9609, loss_bbox: 0.2110, loss_mask: 0.2174, loss: 0.6372 2024-05-31 12:30:03,249 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 6:02:47, time: 0.855, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0360, loss_cls: 0.1539, acc: 94.1467, loss_bbox: 0.2035, loss_mask: 0.2162, loss: 0.6219 2024-05-31 12:30:44,034 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 6:02:05, time: 0.816, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0377, loss_cls: 0.1524, acc: 94.1641, loss_bbox: 0.2073, loss_mask: 0.2200, loss: 0.6296 2024-05-31 12:31:25,616 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 6:01:24, time: 0.832, data_time: 0.067, memory: 18874, loss_rpn_cls: 
0.0130, loss_rpn_bbox: 0.0396, loss_cls: 0.1536, acc: 94.1582, loss_bbox: 0.2043, loss_mask: 0.2116, loss: 0.6220 2024-05-31 12:32:06,544 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 6:00:42, time: 0.819, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0394, loss_cls: 0.1559, acc: 94.0903, loss_bbox: 0.2056, loss_mask: 0.2178, loss: 0.6309 2024-05-31 12:32:49,729 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 6:00:02, time: 0.864, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0394, loss_cls: 0.1530, acc: 94.1404, loss_bbox: 0.2013, loss_mask: 0.2147, loss: 0.6204 2024-05-31 12:33:34,786 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 5:59:22, time: 0.901, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0374, loss_cls: 0.1526, acc: 94.2053, loss_bbox: 0.2028, loss_mask: 0.2121, loss: 0.6171 2024-05-31 12:34:17,106 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 5:58:41, time: 0.846, data_time: 0.080, memory: 18874, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0425, loss_cls: 0.1626, acc: 93.6931, loss_bbox: 0.2140, loss_mask: 0.2210, loss: 0.6544 2024-05-31 12:34:59,158 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 5:58:00, time: 0.841, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0430, loss_cls: 0.1641, acc: 93.7551, loss_bbox: 0.2199, loss_mask: 0.2225, loss: 0.6644 2024-05-31 12:35:42,297 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 5:57:19, time: 0.863, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0363, loss_cls: 0.1490, acc: 94.3240, loss_bbox: 0.1935, loss_mask: 0.2087, loss: 0.5996 2024-05-31 12:36:25,298 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 5:56:38, time: 0.860, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0378, loss_cls: 0.1535, acc: 94.1309, loss_bbox: 0.2055, loss_mask: 0.2154, loss: 0.6252 2024-05-31 12:37:06,349 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 5:55:57, time: 0.821, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0408, loss_cls: 0.1605, acc: 93.8267, loss_bbox: 0.2131, loss_mask: 0.2208, loss: 0.6495 2024-05-31 12:37:46,637 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 5:55:15, time: 0.806, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0353, loss_cls: 0.1423, acc: 94.6643, loss_bbox: 0.1886, loss_mask: 0.2054, loss: 0.5827 2024-05-31 12:38:29,970 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 5:54:35, time: 0.867, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0404, loss_cls: 0.1545, acc: 94.1348, loss_bbox: 0.2059, loss_mask: 0.2155, loss: 0.6285 2024-05-31 12:39:12,519 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 5:53:54, time: 0.851, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0365, loss_cls: 0.1480, acc: 94.3455, loss_bbox: 0.2003, loss_mask: 0.2122, loss: 0.6099 2024-05-31 12:39:53,572 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 5:53:12, time: 0.821, data_time: 0.079, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0399, loss_cls: 0.1577, acc: 93.9233, loss_bbox: 0.2126, loss_mask: 0.2194, loss: 0.6424 2024-05-31 12:40:34,854 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 5:52:31, time: 0.826, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0396, loss_cls: 0.1544, acc: 94.0728, loss_bbox: 0.2097, loss_mask: 0.2165, loss: 
0.6343 2024-05-31 12:41:18,929 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 5:51:51, time: 0.881, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0403, loss_cls: 0.1574, acc: 93.9194, loss_bbox: 0.2116, loss_mask: 0.2187, loss: 0.6418 2024-05-31 12:42:00,012 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 5:51:09, time: 0.822, data_time: 0.077, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0393, loss_cls: 0.1537, acc: 94.1152, loss_bbox: 0.2090, loss_mask: 0.2142, loss: 0.6290 2024-05-31 12:42:41,136 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 5:50:28, time: 0.823, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0391, loss_cls: 0.1561, acc: 94.0034, loss_bbox: 0.2075, loss_mask: 0.2117, loss: 0.6272 2024-05-31 12:43:22,195 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 5:49:46, time: 0.821, data_time: 0.080, memory: 18874, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0413, loss_cls: 0.1619, acc: 93.8098, loss_bbox: 0.2215, loss_mask: 0.2206, loss: 0.6598 2024-05-31 12:44:02,958 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 5:49:05, time: 0.815, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0361, loss_cls: 0.1468, acc: 94.4231, loss_bbox: 0.1991, loss_mask: 0.2089, loss: 0.6032 2024-05-31 12:44:45,343 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 5:48:24, time: 0.848, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0351, loss_cls: 0.1390, acc: 94.7441, loss_bbox: 0.1887, loss_mask: 0.2107, loss: 0.5854 2024-05-31 12:45:26,107 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 5:47:42, time: 0.815, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0401, loss_cls: 0.1621, acc: 93.6875, loss_bbox: 0.2161, loss_mask: 0.2190, loss: 0.6500 2024-05-31 12:46:07,149 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 5:47:01, time: 0.821, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0387, loss_cls: 0.1622, acc: 93.8220, loss_bbox: 0.2134, loss_mask: 0.2178, loss: 0.6450 2024-05-31 12:46:48,261 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 5:46:19, time: 0.822, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0404, loss_cls: 0.1586, acc: 94.0527, loss_bbox: 0.2103, loss_mask: 0.2196, loss: 0.6426 2024-05-31 12:47:32,184 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 5:45:39, time: 0.878, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0418, loss_cls: 0.1601, acc: 93.8589, loss_bbox: 0.2181, loss_mask: 0.2184, loss: 0.6526 2024-05-31 12:48:15,252 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 5:44:58, time: 0.861, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0369, loss_cls: 0.1536, acc: 94.1292, loss_bbox: 0.2048, loss_mask: 0.2166, loss: 0.6231 2024-05-31 12:48:56,032 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 5:44:17, time: 0.816, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0363, loss_cls: 0.1512, acc: 94.2231, loss_bbox: 0.2046, loss_mask: 0.2160, loss: 0.6211 2024-05-31 12:49:36,943 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 5:43:35, time: 0.818, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0375, loss_cls: 0.1600, acc: 93.9070, loss_bbox: 0.2123, loss_mask: 0.2235, loss: 0.6464 2024-05-31 12:50:23,979 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 5:42:56, time: 
0.941, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0365, loss_cls: 0.1473, acc: 94.4294, loss_bbox: 0.1979, loss_mask: 0.2106, loss: 0.6053 2024-05-31 12:51:04,431 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 5:42:14, time: 0.809, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0393, loss_cls: 0.1588, acc: 93.8923, loss_bbox: 0.2124, loss_mask: 0.2174, loss: 0.6413 2024-05-31 12:51:44,788 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 5:41:32, time: 0.807, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0381, loss_cls: 0.1540, acc: 94.1719, loss_bbox: 0.2089, loss_mask: 0.2142, loss: 0.6281 2024-05-31 12:52:25,157 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 5:40:51, time: 0.807, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0365, loss_cls: 0.1504, acc: 94.2844, loss_bbox: 0.2034, loss_mask: 0.2133, loss: 0.6154 2024-05-31 12:53:07,961 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 5:40:10, time: 0.856, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0383, loss_cls: 0.1566, acc: 94.0710, loss_bbox: 0.2104, loss_mask: 0.2135, loss: 0.6321 2024-05-31 12:53:51,597 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 5:39:29, time: 0.873, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0420, loss_cls: 0.1707, acc: 93.4458, loss_bbox: 0.2245, loss_mask: 0.2262, loss: 0.6788 2024-05-31 12:54:32,095 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 5:38:48, time: 0.810, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0366, loss_cls: 0.1462, acc: 94.4502, loss_bbox: 0.1986, loss_mask: 0.2116, loss: 0.6043 2024-05-31 12:55:12,407 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 5:38:06, time: 0.806, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0394, loss_cls: 0.1508, acc: 94.2148, loss_bbox: 0.2046, loss_mask: 0.2135, loss: 0.6204 2024-05-31 12:55:55,540 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 5:37:25, time: 0.863, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0357, loss_cls: 0.1481, acc: 94.3542, loss_bbox: 0.2020, loss_mask: 0.2142, loss: 0.6127 2024-05-31 12:56:36,417 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 5:36:44, time: 0.817, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0385, loss_cls: 0.1547, acc: 94.0742, loss_bbox: 0.2119, loss_mask: 0.2226, loss: 0.6402 2024-05-31 12:57:16,989 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 5:36:02, time: 0.812, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0358, loss_cls: 0.1505, acc: 94.2588, loss_bbox: 0.1988, loss_mask: 0.2145, loss: 0.6109 2024-05-31 12:57:57,493 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 5:35:20, time: 0.810, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0396, loss_cls: 0.1586, acc: 93.9341, loss_bbox: 0.2147, loss_mask: 0.2158, loss: 0.6409 2024-05-31 12:58:38,588 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 5:34:39, time: 0.822, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0399, loss_cls: 0.1605, acc: 93.8604, loss_bbox: 0.2162, loss_mask: 0.2197, loss: 0.6504 2024-05-31 12:59:22,502 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 5:33:58, time: 0.878, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0422, loss_cls: 0.1619, acc: 
93.8032, loss_bbox: 0.2190, loss_mask: 0.2207, loss: 0.6585 2024-05-31 13:00:03,242 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 5:33:17, time: 0.815, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0362, loss_cls: 0.1462, acc: 94.3474, loss_bbox: 0.2001, loss_mask: 0.2106, loss: 0.6050 2024-05-31 13:00:43,622 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 5:32:35, time: 0.808, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0364, loss_cls: 0.1521, acc: 94.2383, loss_bbox: 0.2070, loss_mask: 0.2179, loss: 0.6268 2024-05-31 13:01:24,654 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 5:31:53, time: 0.821, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0376, loss_cls: 0.1484, acc: 94.3147, loss_bbox: 0.1993, loss_mask: 0.2158, loss: 0.6139 2024-05-31 13:02:07,774 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 5:31:13, time: 0.862, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0375, loss_cls: 0.1562, acc: 94.0342, loss_bbox: 0.2046, loss_mask: 0.2164, loss: 0.6277 2024-05-31 13:02:52,944 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 5:30:33, time: 0.903, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0381, loss_cls: 0.1589, acc: 93.8364, loss_bbox: 0.2121, loss_mask: 0.2211, loss: 0.6427 2024-05-31 13:03:34,745 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 5:29:52, time: 0.836, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0381, loss_cls: 0.1547, acc: 94.0901, loss_bbox: 0.2076, loss_mask: 0.2155, loss: 0.6292 2024-05-31 13:04:15,709 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 5:29:10, time: 0.819, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0380, loss_cls: 0.1542, acc: 94.1423, loss_bbox: 0.2097, loss_mask: 0.2203, loss: 0.6351 2024-05-31 13:05:00,901 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 5:28:30, time: 0.904, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0359, loss_cls: 0.1471, acc: 94.3857, loss_bbox: 0.1975, loss_mask: 0.2101, loss: 0.6029 2024-05-31 13:05:42,198 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 5:27:49, time: 0.826, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0388, loss_cls: 0.1531, acc: 94.1692, loss_bbox: 0.2063, loss_mask: 0.2151, loss: 0.6266 2024-05-31 13:06:22,729 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 5:27:07, time: 0.811, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0375, loss_cls: 0.1555, acc: 93.9863, loss_bbox: 0.2132, loss_mask: 0.2192, loss: 0.6388 2024-05-31 13:07:03,707 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 5:26:26, time: 0.820, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0362, loss_cls: 0.1507, acc: 94.3162, loss_bbox: 0.2005, loss_mask: 0.2162, loss: 0.6157 2024-05-31 13:07:47,268 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 5:25:45, time: 0.871, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0391, loss_cls: 0.1549, acc: 94.0610, loss_bbox: 0.2055, loss_mask: 0.2168, loss: 0.6291 2024-05-31 13:08:30,553 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 5:25:04, time: 0.866, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0355, loss_cls: 0.1486, acc: 94.3376, loss_bbox: 0.2017, loss_mask: 0.2111, loss: 0.6098 2024-05-31 13:09:11,796 - mmdet - INFO - Epoch 
[9][5800/7330] lr: 1.000e-05, eta: 5:24:23, time: 0.825, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0381, loss_cls: 0.1485, acc: 94.3325, loss_bbox: 0.2004, loss_mask: 0.2125, loss: 0.6114 2024-05-31 13:09:53,195 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 5:23:42, time: 0.828, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0388, loss_cls: 0.1511, acc: 94.1912, loss_bbox: 0.2029, loss_mask: 0.2159, loss: 0.6211 2024-05-31 13:10:37,018 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 5:23:01, time: 0.876, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0411, loss_cls: 0.1623, acc: 93.8267, loss_bbox: 0.2132, loss_mask: 0.2213, loss: 0.6520 2024-05-31 13:11:17,346 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 5:22:19, time: 0.807, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0391, loss_cls: 0.1517, acc: 94.1770, loss_bbox: 0.2065, loss_mask: 0.2132, loss: 0.6239 2024-05-31 13:11:57,447 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 5:21:37, time: 0.802, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0372, loss_cls: 0.1526, acc: 94.1719, loss_bbox: 0.2083, loss_mask: 0.2180, loss: 0.6286 2024-05-31 13:12:38,105 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 5:20:56, time: 0.813, data_time: 0.067, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0365, loss_cls: 0.1486, acc: 94.3831, loss_bbox: 0.1993, loss_mask: 0.2129, loss: 0.6093 2024-05-31 13:13:21,533 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 5:20:15, time: 0.869, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0397, loss_cls: 0.1594, acc: 93.9321, loss_bbox: 0.2101, loss_mask: 0.2203, loss: 0.6429 2024-05-31 13:14:02,270 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 5:19:34, time: 0.815, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0356, loss_cls: 0.1544, acc: 94.1770, loss_bbox: 0.2011, loss_mask: 0.2105, loss: 0.6134 2024-05-31 13:14:42,953 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 5:18:52, time: 0.814, data_time: 0.042, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0378, loss_cls: 0.1558, acc: 94.1436, loss_bbox: 0.2094, loss_mask: 0.2155, loss: 0.6318 2024-05-31 13:15:23,708 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 5:18:10, time: 0.815, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0377, loss_cls: 0.1531, acc: 94.1174, loss_bbox: 0.2038, loss_mask: 0.2147, loss: 0.6231 2024-05-31 13:16:04,304 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 5:17:29, time: 0.812, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0363, loss_cls: 0.1401, acc: 94.6682, loss_bbox: 0.1918, loss_mask: 0.2118, loss: 0.5912 2024-05-31 13:16:47,819 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 5:16:48, time: 0.870, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0397, loss_cls: 0.1590, acc: 93.8379, loss_bbox: 0.2166, loss_mask: 0.2210, loss: 0.6508 2024-05-31 13:17:32,659 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 5:16:08, time: 0.897, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0388, loss_cls: 0.1482, acc: 94.3674, loss_bbox: 0.2025, loss_mask: 0.2083, loss: 0.6097 2024-05-31 13:18:13,888 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 5:15:26, time: 0.825, data_time: 0.047, memory: 18874, loss_rpn_cls: 
0.0111, loss_rpn_bbox: 0.0364, loss_cls: 0.1524, acc: 94.2378, loss_bbox: 0.2033, loss_mask: 0.2117, loss: 0.6149 2024-05-31 13:18:54,681 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 5:14:45, time: 0.816, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0372, loss_cls: 0.1506, acc: 94.1780, loss_bbox: 0.2045, loss_mask: 0.2095, loss: 0.6139 2024-05-31 13:19:39,662 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 5:14:05, time: 0.900, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0382, loss_cls: 0.1585, acc: 93.9897, loss_bbox: 0.2087, loss_mask: 0.2163, loss: 0.6353 2024-05-31 13:20:20,757 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 5:13:23, time: 0.822, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0381, loss_cls: 0.1537, acc: 94.1104, loss_bbox: 0.2049, loss_mask: 0.2150, loss: 0.6245 2024-05-31 13:21:01,562 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 5:12:42, time: 0.816, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0384, loss_cls: 0.1555, acc: 94.1025, loss_bbox: 0.2024, loss_mask: 0.2161, loss: 0.6263 2024-05-31 13:21:42,018 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 5:12:00, time: 0.809, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0357, loss_cls: 0.1503, acc: 94.3137, loss_bbox: 0.2007, loss_mask: 0.2114, loss: 0.6111 2024-05-31 13:22:25,290 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 5:11:19, time: 0.865, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0345, loss_cls: 0.1415, acc: 94.5879, loss_bbox: 0.1922, loss_mask: 0.2129, loss: 0.5911 2024-05-31 13:23:08,018 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 5:10:38, time: 0.855, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0379, loss_cls: 0.1526, acc: 94.1655, loss_bbox: 0.2065, loss_mask: 0.2115, loss: 0.6211 2024-05-31 13:23:48,931 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 5:09:57, time: 0.818, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0406, loss_cls: 0.1587, acc: 94.0112, loss_bbox: 0.2119, loss_mask: 0.2196, loss: 0.6450 2024-05-31 13:24:29,451 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 5:09:15, time: 0.810, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0385, loss_cls: 0.1528, acc: 94.2153, loss_bbox: 0.2035, loss_mask: 0.2153, loss: 0.6220 2024-05-31 13:25:11,883 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 5:08:34, time: 0.849, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0374, loss_cls: 0.1499, acc: 94.3369, loss_bbox: 0.1983, loss_mask: 0.2133, loss: 0.6107 2024-05-31 13:25:52,395 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 5:07:52, time: 0.810, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0384, loss_cls: 0.1526, acc: 94.2017, loss_bbox: 0.2044, loss_mask: 0.2130, loss: 0.6209 2024-05-31 13:26:33,302 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 5:07:11, time: 0.818, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0387, loss_cls: 0.1552, acc: 94.1731, loss_bbox: 0.2042, loss_mask: 0.2146, loss: 0.6253 2024-05-31 13:27:14,200 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 5:06:29, time: 0.818, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0389, loss_cls: 0.1553, acc: 94.0774, loss_bbox: 0.2036, loss_mask: 0.2156, loss: 
0.6257
2024-05-31 13:27:57,211 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 5:05:48, time: 0.860, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0422, loss_cls: 0.1675, acc: 93.5933, loss_bbox: 0.2275, loss_mask: 0.2266, loss: 0.6778
2024-05-31 13:28:38,086 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 5:05:07, time: 0.817, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0390, loss_cls: 0.1518, acc: 94.0986, loss_bbox: 0.2097, loss_mask: 0.2194, loss: 0.6321
2024-05-31 13:29:18,849 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 5:04:25, time: 0.816, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0389, loss_cls: 0.1562, acc: 94.0857, loss_bbox: 0.2067, loss_mask: 0.2140, loss: 0.6281
2024-05-31 13:29:59,327 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 5:03:44, time: 0.809, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0378, loss_cls: 0.1475, acc: 94.4175, loss_bbox: 0.1990, loss_mask: 0.2144, loss: 0.6106
2024-05-31 13:30:24,668 - mmdet - INFO - Saving checkpoint at 9 epochs
2024-05-31 13:32:12,589 - mmdet - INFO - Evaluating bbox...
2024-05-31 13:32:34,151 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.489
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.714
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.538
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.306
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.535
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.663
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.602
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.602
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.602
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.415
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.647
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.767
2024-05-31 13:32:34,151 - mmdet - INFO - Evaluating segm...
2024-05-31 13:32:56,063 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.434
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.678
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.465
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.217
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.470
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.649
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.540
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.540
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.540
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.339
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.585
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.728
2024-05-31 13:32:56,378 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16.py
2024-05-31 13:32:56,380 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4890, bbox_mAP_50: 0.7140, bbox_mAP_75: 0.5380, bbox_mAP_s: 0.3060, bbox_mAP_m: 0.5350, bbox_mAP_l: 0.6630, bbox_mAP_copypaste: 0.489 0.714 0.538 0.306 0.535 0.663, segm_mAP: 0.4340, segm_mAP_50: 0.6780, segm_mAP_75: 0.4650, segm_mAP_s: 0.2170, segm_mAP_m: 0.4700, segm_mAP_l: 0.6490, segm_mAP_copypaste: 0.434 0.678 0.465 0.217 0.470 0.649
2024-05-31 13:33:50,833 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 5:02:34, time: 1.089, data_time: 0.139, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0374, loss_cls: 0.1465, acc: 94.3923, loss_bbox: 0.1999, loss_mask: 0.2136, loss: 0.6103
2024-05-31 13:34:34,859 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 5:01:53, time: 0.881, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0388, loss_cls: 0.1497, acc: 94.3196, loss_bbox: 0.2053, loss_mask: 0.2199, loss: 0.6262
2024-05-31 13:35:16,470 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 5:01:12, time: 0.832, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0369, loss_cls: 0.1539, acc: 94.1570, loss_bbox: 0.2071, loss_mask: 0.2186, loss: 0.6282
2024-05-31 13:36:00,390 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 5:00:31, time: 0.878, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0383, loss_cls: 0.1540, acc: 94.0913, loss_bbox: 0.2103, loss_mask: 0.2185, loss: 0.6332
2024-05-31 13:36:40,878 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 4:59:50, time: 0.810, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0347, loss_cls: 0.1381, acc: 94.7905, loss_bbox: 0.1891, loss_mask: 0.2058, loss: 0.5789
2024-05-31 13:37:21,173 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 4:59:08, time: 0.806, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0343, loss_cls: 0.1446, acc: 94.4639, loss_bbox: 0.1959, loss_mask: 0.2086, loss: 0.5937
2024-05-31 13:38:02,276 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 4:58:26, time: 0.822, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0372, loss_cls: 0.1454, acc: 94.4211, loss_bbox: 0.1933, loss_mask: 0.2092, loss: 0.5973
2024-05-31 13:38:43,871 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 4:57:45, time: 0.832, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0388, loss_cls: 0.1585, acc: 93.9546, loss_bbox: 0.2072,
loss_mask: 0.2118, loss: 0.6280 2024-05-31 13:39:24,997 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 4:57:04, time: 0.823, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0374, loss_cls: 0.1486, acc: 94.2717, loss_bbox: 0.2031, loss_mask: 0.2107, loss: 0.6120 2024-05-31 13:40:05,815 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 4:56:22, time: 0.816, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0387, loss_cls: 0.1528, acc: 94.1990, loss_bbox: 0.2092, loss_mask: 0.2167, loss: 0.6299 2024-05-31 13:40:50,305 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 4:55:42, time: 0.890, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0353, loss_cls: 0.1491, acc: 94.2668, loss_bbox: 0.1980, loss_mask: 0.2099, loss: 0.6045 2024-05-31 13:41:33,728 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 4:55:01, time: 0.868, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0380, loss_cls: 0.1501, acc: 94.2451, loss_bbox: 0.2066, loss_mask: 0.2161, loss: 0.6230 2024-05-31 13:42:14,750 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 4:54:19, time: 0.820, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0369, loss_cls: 0.1442, acc: 94.5024, loss_bbox: 0.1922, loss_mask: 0.2099, loss: 0.5948 2024-05-31 13:42:55,586 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 4:53:38, time: 0.817, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0378, loss_cls: 0.1510, acc: 94.2808, loss_bbox: 0.2013, loss_mask: 0.2113, loss: 0.6128 2024-05-31 13:43:36,228 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 4:52:56, time: 0.813, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0342, loss_cls: 0.1423, acc: 94.5720, loss_bbox: 0.1919, loss_mask: 0.2091, loss: 0.5891 2024-05-31 13:44:17,480 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 4:52:15, time: 0.825, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0372, loss_cls: 0.1497, acc: 94.2217, loss_bbox: 0.2073, loss_mask: 0.2149, loss: 0.6219 2024-05-31 13:44:58,004 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 4:51:33, time: 0.811, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0375, loss_cls: 0.1515, acc: 94.1282, loss_bbox: 0.2053, loss_mask: 0.2157, loss: 0.6215 2024-05-31 13:45:39,055 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 4:50:52, time: 0.821, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0374, loss_cls: 0.1452, acc: 94.4067, loss_bbox: 0.2001, loss_mask: 0.2128, loss: 0.6069 2024-05-31 13:46:20,406 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 4:50:10, time: 0.827, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0371, loss_cls: 0.1469, acc: 94.4475, loss_bbox: 0.1970, loss_mask: 0.2110, loss: 0.6039 2024-05-31 13:47:02,075 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 4:49:29, time: 0.833, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0359, loss_cls: 0.1533, acc: 94.1460, loss_bbox: 0.2024, loss_mask: 0.2145, loss: 0.6186 2024-05-31 13:47:46,072 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 4:48:49, time: 0.880, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0359, loss_cls: 0.1472, acc: 94.3489, loss_bbox: 0.2001, loss_mask: 0.2095, loss: 0.6045 2024-05-31 13:48:27,823 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 4:48:07, time: 0.835, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0389, loss_cls: 0.1540, acc: 94.2048, loss_bbox: 0.2031, loss_mask: 0.2095, loss: 0.6185 2024-05-31 13:49:14,793 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 4:47:28, time: 0.939, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0368, loss_cls: 0.1505, acc: 94.2058, loss_bbox: 0.2040, loss_mask: 0.2128, loss: 0.6165 2024-05-31 13:50:01,326 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 4:46:48, time: 0.930, data_time: 0.078, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0408, loss_cls: 0.1612, acc: 93.8091, loss_bbox: 0.2163, loss_mask: 0.2169, loss: 0.6477 2024-05-31 13:50:45,094 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 4:46:07, time: 0.876, data_time: 0.074, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0369, loss_cls: 0.1538, acc: 94.1213, loss_bbox: 0.2060, loss_mask: 0.2156, loss: 0.6255 2024-05-31 13:51:25,922 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 4:45:26, time: 0.816, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0363, loss_cls: 0.1449, acc: 94.4993, loss_bbox: 0.1950, loss_mask: 0.2118, loss: 0.5989 2024-05-31 13:52:07,321 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 4:44:44, time: 0.828, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0391, loss_cls: 0.1488, acc: 94.2605, loss_bbox: 0.2004, loss_mask: 0.2073, loss: 0.6074 2024-05-31 13:52:50,036 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 4:44:03, time: 0.854, data_time: 0.083, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0390, loss_cls: 0.1583, acc: 93.9153, loss_bbox: 0.2175, loss_mask: 0.2240, loss: 0.6520 2024-05-31 13:53:30,723 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 4:43:22, time: 0.814, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0348, loss_cls: 0.1399, acc: 94.6980, loss_bbox: 0.1938, loss_mask: 0.2123, loss: 0.5931 2024-05-31 13:54:11,858 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 4:42:40, time: 0.823, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0375, loss_cls: 0.1503, acc: 94.2776, loss_bbox: 0.2048, loss_mask: 0.2149, loss: 0.6199 2024-05-31 13:54:53,341 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 4:41:59, time: 0.830, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0389, loss_cls: 0.1511, acc: 94.1855, loss_bbox: 0.2057, loss_mask: 0.2172, loss: 0.6252 2024-05-31 13:55:36,739 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 4:41:18, time: 0.867, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0346, loss_cls: 0.1451, acc: 94.4666, loss_bbox: 0.1932, loss_mask: 0.2100, loss: 0.5950 2024-05-31 13:56:20,364 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 4:40:37, time: 0.873, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0353, loss_cls: 0.1441, acc: 94.4727, loss_bbox: 0.1994, loss_mask: 0.2066, loss: 0.5970 2024-05-31 13:57:01,640 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 4:39:56, time: 0.826, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0370, loss_cls: 0.1527, acc: 94.1584, loss_bbox: 0.2045, loss_mask: 0.2159, loss: 0.6225 2024-05-31 13:57:43,068 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 4:39:15, time: 0.828, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0118, 
loss_rpn_bbox: 0.0387, loss_cls: 0.1502, acc: 94.2444, loss_bbox: 0.2033, loss_mask: 0.2079, loss: 0.6119 2024-05-31 13:58:23,693 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 4:38:33, time: 0.813, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0359, loss_cls: 0.1470, acc: 94.3801, loss_bbox: 0.1997, loss_mask: 0.2160, loss: 0.6099 2024-05-31 13:59:03,880 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 4:37:51, time: 0.804, data_time: 0.053, memory: 18874, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0365, loss_cls: 0.1504, acc: 94.2617, loss_bbox: 0.2030, loss_mask: 0.2176, loss: 0.6185 2024-05-31 13:59:45,053 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 4:37:10, time: 0.823, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0387, loss_cls: 0.1477, acc: 94.3633, loss_bbox: 0.1967, loss_mask: 0.2067, loss: 0.6012 2024-05-31 14:00:25,682 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 4:36:28, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0369, loss_cls: 0.1480, acc: 94.3079, loss_bbox: 0.2011, loss_mask: 0.2101, loss: 0.6073 2024-05-31 14:01:06,753 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 4:35:47, time: 0.821, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0376, loss_cls: 0.1521, acc: 94.0771, loss_bbox: 0.2041, loss_mask: 0.2146, loss: 0.6202 2024-05-31 14:01:48,362 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 4:35:05, time: 0.832, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0388, loss_cls: 0.1488, acc: 94.3320, loss_bbox: 0.2001, loss_mask: 0.2125, loss: 0.6124 2024-05-31 14:02:31,762 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 4:34:24, time: 0.868, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0392, loss_cls: 0.1539, acc: 94.1155, loss_bbox: 0.2017, loss_mask: 0.2178, loss: 0.6250 2024-05-31 14:03:12,822 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 4:33:43, time: 0.821, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0383, loss_cls: 0.1506, acc: 94.1553, loss_bbox: 0.2033, loss_mask: 0.2117, loss: 0.6154 2024-05-31 14:04:03,138 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 4:33:04, time: 1.006, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0393, loss_cls: 0.1586, acc: 94.0020, loss_bbox: 0.2086, loss_mask: 0.2165, loss: 0.6360 2024-05-31 14:04:46,299 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 4:32:23, time: 0.863, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0360, loss_cls: 0.1470, acc: 94.3472, loss_bbox: 0.1998, loss_mask: 0.2131, loss: 0.6073 2024-05-31 14:05:29,129 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 4:31:42, time: 0.857, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0373, loss_cls: 0.1467, acc: 94.3740, loss_bbox: 0.2012, loss_mask: 0.2109, loss: 0.6078 2024-05-31 14:06:10,125 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 4:31:01, time: 0.820, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0396, loss_cls: 0.1548, acc: 94.1765, loss_bbox: 0.2052, loss_mask: 0.2210, loss: 0.6336 2024-05-31 14:06:51,198 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 4:30:19, time: 0.822, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0363, loss_cls: 0.1481, acc: 94.2363, loss_bbox: 0.2011, loss_mask: 0.2124, 
loss: 0.6094 2024-05-31 14:07:33,131 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 4:29:38, time: 0.839, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0394, loss_cls: 0.1550, acc: 94.0789, loss_bbox: 0.2097, loss_mask: 0.2127, loss: 0.6293 2024-05-31 14:08:14,359 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 4:28:57, time: 0.825, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0372, loss_cls: 0.1476, acc: 94.2930, loss_bbox: 0.2023, loss_mask: 0.2144, loss: 0.6127 2024-05-31 14:08:55,391 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 4:28:15, time: 0.821, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0382, loss_cls: 0.1523, acc: 94.2598, loss_bbox: 0.2042, loss_mask: 0.2141, loss: 0.6211 2024-05-31 14:09:37,060 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 4:27:34, time: 0.833, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0378, loss_cls: 0.1523, acc: 94.1960, loss_bbox: 0.2067, loss_mask: 0.2132, loss: 0.6220 2024-05-31 14:10:21,457 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 4:26:53, time: 0.888, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0354, loss_cls: 0.1497, acc: 94.3142, loss_bbox: 0.1999, loss_mask: 0.2088, loss: 0.6061 2024-05-31 14:11:04,500 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 4:26:12, time: 0.861, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0374, loss_cls: 0.1520, acc: 94.1367, loss_bbox: 0.2052, loss_mask: 0.2170, loss: 0.6244 2024-05-31 14:11:45,471 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 4:25:31, time: 0.819, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0395, loss_cls: 0.1503, acc: 94.3064, loss_bbox: 0.2020, loss_mask: 0.2139, loss: 0.6186 2024-05-31 14:12:26,184 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 4:24:49, time: 0.814, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0369, loss_cls: 0.1483, acc: 94.4543, loss_bbox: 0.1991, loss_mask: 0.2137, loss: 0.6098 2024-05-31 14:13:07,071 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 4:24:08, time: 0.818, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0365, loss_cls: 0.1511, acc: 94.3467, loss_bbox: 0.1993, loss_mask: 0.2135, loss: 0.6124 2024-05-31 14:13:48,921 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 4:23:26, time: 0.837, data_time: 0.055, memory: 18874, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0388, loss_cls: 0.1491, acc: 94.1753, loss_bbox: 0.2096, loss_mask: 0.2204, loss: 0.6309 2024-05-31 14:14:29,674 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 4:22:45, time: 0.815, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0377, loss_cls: 0.1512, acc: 94.1787, loss_bbox: 0.2055, loss_mask: 0.2195, loss: 0.6267 2024-05-31 14:15:11,275 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 4:22:03, time: 0.832, data_time: 0.069, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0387, loss_cls: 0.1539, acc: 94.0278, loss_bbox: 0.2112, loss_mask: 0.2155, loss: 0.6321 2024-05-31 14:15:52,694 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 4:21:22, time: 0.828, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0378, loss_cls: 0.1476, acc: 94.3684, loss_bbox: 0.1995, loss_mask: 0.2100, loss: 0.6070 2024-05-31 14:16:33,870 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
4:20:41, time: 0.824, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0387, loss_cls: 0.1477, acc: 94.3184, loss_bbox: 0.2026, loss_mask: 0.2178, loss: 0.6181 2024-05-31 14:17:18,857 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 4:20:00, time: 0.900, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0408, loss_cls: 0.1534, acc: 94.2124, loss_bbox: 0.2067, loss_mask: 0.2183, loss: 0.6311 2024-05-31 14:18:00,593 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 4:19:19, time: 0.835, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0401, loss_cls: 0.1542, acc: 94.0723, loss_bbox: 0.2067, loss_mask: 0.2116, loss: 0.6248 2024-05-31 14:18:49,083 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 4:18:39, time: 0.970, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0387, loss_cls: 0.1534, acc: 94.0598, loss_bbox: 0.2084, loss_mask: 0.2191, loss: 0.6319 2024-05-31 14:19:31,763 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 4:17:58, time: 0.854, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0358, loss_cls: 0.1378, acc: 94.7192, loss_bbox: 0.1887, loss_mask: 0.2048, loss: 0.5787 2024-05-31 14:20:14,991 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 4:17:17, time: 0.865, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0375, loss_cls: 0.1489, acc: 94.3296, loss_bbox: 0.1998, loss_mask: 0.2131, loss: 0.6117 2024-05-31 14:20:56,590 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 4:16:36, time: 0.832, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0380, loss_cls: 0.1506, acc: 94.2188, loss_bbox: 0.2040, loss_mask: 0.2140, loss: 0.6193 2024-05-31 14:21:37,516 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 4:15:54, time: 0.819, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0377, loss_cls: 0.1482, acc: 94.3408, loss_bbox: 0.2027, loss_mask: 0.2197, loss: 0.6209 2024-05-31 14:22:19,046 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 4:15:13, time: 0.831, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0399, loss_cls: 0.1581, acc: 93.8860, loss_bbox: 0.2146, loss_mask: 0.2188, loss: 0.6439 2024-05-31 14:22:59,938 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 4:14:32, time: 0.818, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0372, loss_cls: 0.1513, acc: 94.2632, loss_bbox: 0.2042, loss_mask: 0.2194, loss: 0.6246 2024-05-31 14:23:40,779 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 4:13:50, time: 0.817, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0392, loss_cls: 0.1554, acc: 94.0662, loss_bbox: 0.2042, loss_mask: 0.2142, loss: 0.6255 2024-05-31 14:24:22,132 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 4:13:09, time: 0.827, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0402, loss_cls: 0.1532, acc: 94.0991, loss_bbox: 0.2057, loss_mask: 0.2138, loss: 0.6262 2024-05-31 14:25:06,942 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 4:12:28, time: 0.896, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0356, loss_cls: 0.1454, acc: 94.4639, loss_bbox: 0.1971, loss_mask: 0.2119, loss: 0.6015 2024-05-31 14:25:50,557 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 4:11:47, time: 0.872, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 
0.0386, loss_cls: 0.1512, acc: 94.2012, loss_bbox: 0.2046, loss_mask: 0.2170, loss: 0.6237 2024-05-31 14:26:30,948 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 4:11:06, time: 0.808, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0374, loss_cls: 0.1473, acc: 94.3503, loss_bbox: 0.2003, loss_mask: 0.2131, loss: 0.6099 2024-05-31 14:27:11,242 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 4:10:24, time: 0.806, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0359, loss_cls: 0.1481, acc: 94.3682, loss_bbox: 0.1978, loss_mask: 0.2131, loss: 0.6066 2024-05-31 14:27:53,011 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 4:09:43, time: 0.835, data_time: 0.068, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0406, loss_cls: 0.1537, acc: 94.0801, loss_bbox: 0.2110, loss_mask: 0.2198, loss: 0.6380 2024-05-31 14:28:33,549 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 4:09:01, time: 0.811, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0373, loss_cls: 0.1443, acc: 94.4268, loss_bbox: 0.1992, loss_mask: 0.2089, loss: 0.6021 2024-05-31 14:29:14,454 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 4:08:19, time: 0.818, data_time: 0.059, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0368, loss_cls: 0.1512, acc: 94.1741, loss_bbox: 0.2046, loss_mask: 0.2164, loss: 0.6212 2024-05-31 14:29:56,096 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 4:07:38, time: 0.833, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0390, loss_cls: 0.1535, acc: 94.1108, loss_bbox: 0.2108, loss_mask: 0.2159, loss: 0.6320 2024-05-31 14:30:37,173 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 4:06:56, time: 0.822, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0378, loss_cls: 0.1502, acc: 94.2444, loss_bbox: 0.2012, loss_mask: 0.2141, loss: 0.6155 2024-05-31 14:31:18,108 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 4:06:15, time: 0.819, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0369, loss_cls: 0.1525, acc: 94.2461, loss_bbox: 0.2008, loss_mask: 0.2121, loss: 0.6156 2024-05-31 14:32:01,335 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 4:05:34, time: 0.865, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0377, loss_cls: 0.1447, acc: 94.4553, loss_bbox: 0.2002, loss_mask: 0.2128, loss: 0.6074 2024-05-31 14:32:41,619 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 4:04:52, time: 0.806, data_time: 0.038, memory: 18874, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0359, loss_cls: 0.1393, acc: 94.7253, loss_bbox: 0.1887, loss_mask: 0.2057, loss: 0.5809 2024-05-31 14:33:29,741 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 4:04:13, time: 0.963, data_time: 0.071, memory: 18874, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0399, loss_cls: 0.1535, acc: 94.0576, loss_bbox: 0.2046, loss_mask: 0.2120, loss: 0.6232 2024-05-31 14:34:15,856 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 4:03:32, time: 0.922, data_time: 0.066, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0386, loss_cls: 0.1527, acc: 94.1648, loss_bbox: 0.2076, loss_mask: 0.2163, loss: 0.6278 2024-05-31 14:34:56,876 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 4:02:51, time: 0.820, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0381, loss_cls: 0.1489, acc: 94.3015, loss_bbox: 0.2026, loss_mask: 0.2157, loss: 0.6175 
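Note on reading these entries: in MMDetection 2.x the reported loss is, as far as these numbers show, simply the sum of the individual loss_* terms (acc is the RoI-head classification accuracy, not a loss). A minimal check against the Epoch [10][4400/7330] entry directly above, with its component values copied in; the tiny gap to the logged total comes from per-term rounding:

```python
# Loss components logged at Epoch [10][4400/7330] above.
# "acc" is omitted because it is an accuracy metric, not a loss term.
parts = {
    "loss_rpn_cls": 0.0121,
    "loss_rpn_bbox": 0.0381,
    "loss_cls": 0.1489,
    "loss_bbox": 0.2026,
    "loss_mask": 0.2157,
}
total = sum(parts.values())
print(f"{total:.4f}")  # 0.6174, vs. the logged loss of 0.6175 (per-term rounding)
```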
2024-05-31 14:35:37,987 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 4:02:09, time: 0.822, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0360, loss_cls: 0.1506, acc: 94.1846, loss_bbox: 0.2042, loss_mask: 0.2156, loss: 0.6184 2024-05-31 14:36:19,070 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 4:01:28, time: 0.822, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0393, loss_cls: 0.1500, acc: 94.2593, loss_bbox: 0.2010, loss_mask: 0.2105, loss: 0.6130 2024-05-31 14:36:59,858 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 4:00:46, time: 0.816, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0359, loss_cls: 0.1449, acc: 94.4690, loss_bbox: 0.1983, loss_mask: 0.2132, loss: 0.6043 2024-05-31 14:37:40,861 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 4:00:05, time: 0.820, data_time: 0.048, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0374, loss_cls: 0.1522, acc: 94.1904, loss_bbox: 0.2044, loss_mask: 0.2142, loss: 0.6209 2024-05-31 14:38:21,639 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 3:59:23, time: 0.816, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0354, loss_cls: 0.1426, acc: 94.4517, loss_bbox: 0.1977, loss_mask: 0.2150, loss: 0.6017 2024-05-31 14:39:02,212 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 3:58:42, time: 0.811, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0383, loss_cls: 0.1499, acc: 94.1558, loss_bbox: 0.2065, loss_mask: 0.2106, loss: 0.6169 2024-05-31 14:39:48,567 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 3:58:01, time: 0.927, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0378, loss_cls: 0.1566, acc: 94.0347, loss_bbox: 0.2085, loss_mask: 0.2191, loss: 0.6344 2024-05-31 14:40:29,514 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 3:57:20, time: 0.819, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0357, loss_cls: 0.1433, acc: 94.3943, loss_bbox: 0.1935, loss_mask: 0.2036, loss: 0.5879 2024-05-31 14:41:10,161 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 3:56:38, time: 0.813, data_time: 0.043, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0355, loss_cls: 0.1442, acc: 94.3752, loss_bbox: 0.1966, loss_mask: 0.2091, loss: 0.5975 2024-05-31 14:41:50,879 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 3:55:57, time: 0.814, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0390, loss_cls: 0.1474, acc: 94.3823, loss_bbox: 0.1994, loss_mask: 0.2136, loss: 0.6132 2024-05-31 14:42:31,034 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 3:55:15, time: 0.803, data_time: 0.051, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0370, loss_cls: 0.1452, acc: 94.4451, loss_bbox: 0.1931, loss_mask: 0.2117, loss: 0.5991 2024-05-31 14:43:11,802 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 3:54:33, time: 0.815, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0362, loss_cls: 0.1442, acc: 94.4810, loss_bbox: 0.1923, loss_mask: 0.2069, loss: 0.5908 2024-05-31 14:43:52,456 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 3:53:52, time: 0.813, data_time: 0.047, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0371, loss_cls: 0.1472, acc: 94.2490, loss_bbox: 0.2014, loss_mask: 0.2140, loss: 0.6117 2024-05-31 14:44:34,137 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 3:53:10, 
time: 0.834, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0403, loss_cls: 0.1588, acc: 93.8899, loss_bbox: 0.2142, loss_mask: 0.2184, loss: 0.6453 2024-05-31 14:45:15,472 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 3:52:29, time: 0.827, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0387, loss_cls: 0.1517, acc: 94.2012, loss_bbox: 0.2054, loss_mask: 0.2119, loss: 0.6202 2024-05-31 14:45:55,757 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 3:51:47, time: 0.806, data_time: 0.054, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0367, loss_cls: 0.1446, acc: 94.4993, loss_bbox: 0.1941, loss_mask: 0.2068, loss: 0.5940 2024-05-31 14:46:39,747 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 3:51:06, time: 0.880, data_time: 0.063, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0398, loss_cls: 0.1593, acc: 93.8608, loss_bbox: 0.2098, loss_mask: 0.2169, loss: 0.6391 2024-05-31 14:47:23,649 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 3:50:26, time: 0.878, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0394, loss_cls: 0.1529, acc: 94.1731, loss_bbox: 0.2069, loss_mask: 0.2126, loss: 0.6250 2024-05-31 14:48:08,837 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 3:49:45, time: 0.904, data_time: 0.044, memory: 18874, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0365, loss_cls: 0.1480, acc: 94.3613, loss_bbox: 0.2000, loss_mask: 0.2142, loss: 0.6113 2024-05-31 14:48:54,080 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 3:49:04, time: 0.905, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0363, loss_cls: 0.1432, acc: 94.5332, loss_bbox: 0.1945, loss_mask: 0.2092, loss: 0.5948 2024-05-31 14:49:34,377 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 3:48:23, time: 0.806, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0344, loss_cls: 0.1368, acc: 94.7412, loss_bbox: 0.1881, loss_mask: 0.2068, loss: 0.5773 2024-05-31 14:50:14,984 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 3:47:41, time: 0.812, data_time: 0.049, memory: 18874, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0375, loss_cls: 0.1502, acc: 94.2659, loss_bbox: 0.2038, loss_mask: 0.2155, loss: 0.6185 2024-05-31 14:50:55,247 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 3:47:00, time: 0.805, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0345, loss_cls: 0.1403, acc: 94.5527, loss_bbox: 0.1936, loss_mask: 0.2084, loss: 0.5872 2024-05-31 14:51:36,520 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 3:46:18, time: 0.826, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0397, loss_cls: 0.1596, acc: 93.8586, loss_bbox: 0.2148, loss_mask: 0.2214, loss: 0.6484 2024-05-31 14:52:17,151 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 3:45:36, time: 0.813, data_time: 0.052, memory: 18874, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0355, loss_cls: 0.1454, acc: 94.4280, loss_bbox: 0.1953, loss_mask: 0.2077, loss: 0.5957 2024-05-31 14:52:58,182 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 3:44:55, time: 0.821, data_time: 0.056, memory: 18874, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0396, loss_cls: 0.1519, acc: 94.1165, loss_bbox: 0.2068, loss_mask: 0.2125, loss: 0.6240 2024-05-31 14:53:39,982 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 3:44:14, time: 0.836, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0372, 
loss_cls: 0.1537, acc: 94.1450, loss_bbox: 0.2085, loss_mask: 0.2105, loss: 0.6222 2024-05-31 14:54:26,138 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 3:43:33, time: 0.923, data_time: 0.062, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0388, loss_cls: 0.1585, acc: 93.9268, loss_bbox: 0.2103, loss_mask: 0.2148, loss: 0.6352 2024-05-31 14:55:06,327 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 3:42:52, time: 0.804, data_time: 0.057, memory: 18874, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0357, loss_cls: 0.1439, acc: 94.4985, loss_bbox: 0.1951, loss_mask: 0.2088, loss: 0.5946 2024-05-31 14:55:46,923 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 3:42:10, time: 0.812, data_time: 0.060, memory: 18874, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0379, loss_cls: 0.1444, acc: 94.5232, loss_bbox: 0.1990, loss_mask: 0.2101, loss: 0.6033 2024-05-31 14:56:27,718 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 3:41:28, time: 0.816, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0350, loss_cls: 0.1435, acc: 94.5190, loss_bbox: 0.1975, loss_mask: 0.2127, loss: 0.6000 2024-05-31 14:57:08,914 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 3:40:47, time: 0.824, data_time: 0.065, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0390, loss_cls: 0.1481, acc: 94.3726, loss_bbox: 0.1961, loss_mask: 0.2106, loss: 0.6058 2024-05-31 14:57:49,976 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 3:40:05, time: 0.821, data_time: 0.061, memory: 18874, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0356, loss_cls: 0.1493, acc: 94.2578, loss_bbox: 0.2034, loss_mask: 0.2100, loss: 0.6099 2024-05-31 14:58:31,460 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 3:39:24, time: 0.830, data_time: 0.046, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0374, loss_cls: 0.1467, acc: 94.3855, loss_bbox: 0.1984, loss_mask: 0.2121, loss: 0.6066 2024-05-31 14:59:12,238 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 3:38:42, time: 0.816, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0371, loss_cls: 0.1561, acc: 94.0530, loss_bbox: 0.2077, loss_mask: 0.2145, loss: 0.6282 2024-05-31 14:59:53,878 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 3:38:01, time: 0.833, data_time: 0.064, memory: 18874, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0398, loss_cls: 0.1521, acc: 94.1294, loss_bbox: 0.2067, loss_mask: 0.2143, loss: 0.6247 2024-05-31 15:00:37,373 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 3:37:20, time: 0.870, data_time: 0.045, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0411, loss_cls: 0.1559, acc: 94.0598, loss_bbox: 0.2129, loss_mask: 0.2184, loss: 0.6417 2024-05-31 15:01:18,855 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 3:36:39, time: 0.830, data_time: 0.050, memory: 18874, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0393, loss_cls: 0.1538, acc: 94.1069, loss_bbox: 0.2065, loss_mask: 0.2179, loss: 0.6311 2024-05-31 15:02:05,612 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 3:35:58, time: 0.935, data_time: 0.070, memory: 18874, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0377, loss_cls: 0.1555, acc: 94.1292, loss_bbox: 0.2068, loss_mask: 0.2146, loss: 0.6268 2024-05-31 15:02:50,002 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 3:35:18, time: 0.888, data_time: 0.058, memory: 18874, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0366, loss_cls: 0.1492, acc: 94.2942, loss_bbox: 0.1987, loss_mask: 0.2097, loss: 0.6062
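For working with a log in this format, a minimal sketch (not part of the training run) of pulling the per-epoch validation summaries, i.e. the "Epoch(val) ... bbox_mAP / segm_mAP" lines such as the epoch-9 one above, out of the plain-text log file; the path below is only a placeholder and should point at the actual log:

```python
import re

# Placeholder path; replace with the real mmdet text log produced by this run.
LOG_PATH = "work_dirs/mask_rcnn_deit_sbl_1568_896_672_fpn_1x_coco_bs16/latest.log"

# Matches lines such as:
# "... - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4890, ... segm_mAP: 0.4340, ..."
VAL_RE = re.compile(
    r"Epoch\(val\) \[(?P<epoch>\d+)\]\[\d+\].*?"
    r"bbox_mAP: (?P<bbox_map>[\d.]+),.*?"
    r"segm_mAP: (?P<segm_map>[\d.]+),"
)

results = []
with open(LOG_PATH) as f:
    for line in f:
        m = VAL_RE.search(line)
        if m:
            results.append(
                (int(m.group("epoch")), float(m.group("bbox_map")), float(m.group("segm_map")))
            )

for epoch, bbox_map, segm_map in results:
    print(f"epoch {epoch}: bbox_mAP={bbox_map:.3f}, segm_mAP={segm_map:.3f}")
# e.g. "epoch 9: bbox_mAP=0.489, segm_mAP=0.434" for the evaluation logged above
```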