2024-05-28 09:20:31,860 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+da84357
------------------------------------------------------------
2024-05-28 09:20:33,383 - mmdet - INFO - Distributed training: True
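[Editor's note, not part of the original log] The config dumped below enables use_flash_attn=True in every branch and casts training to bfloat16 via ToBFloat16HookMMDet together with the DeepSpeed 'adam_zero1_bf16' config, so the environment reported above has to provide both. A minimal, illustrative check that a node meets those two prerequisites (a sketch, assuming only stock PyTorch and an optional flash_attn install):

import importlib.util

import torch

# bf16 needs a CUDA build of PyTorch on Ampere-or-newer GPUs
# (the A100s reported above qualify).
print("CUDA available:      ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("bf16 supported:      ", torch.cuda.is_bf16_supported())

# use_flash_attn=True additionally assumes a working flash_attn package.
print("flash_attn installed:", importlib.util.find_spec("flash_attn") is not None)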
2024-05-28 09:20:34,953 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPThreeBranch',
        n_points=4,
        deform_num_heads=16,
        cffn_ratio=0.25,
        deform_ratio=0.5,
        with_cffn=True,
        interact_attn_type='deform',
        interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=672,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=24,
            embed_dim=1024,
            num_heads=16,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.4,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 1], [2, 3], [4, 5], [6, 7], [8, 9],
                                 [10, 11], [12, 13], [14, 15], [16, 17],
                                 [18, 19], [20, 21], [22, 23]],
            pretrained='./pretrained/deit_3_large_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True
            ],
            window_size=[
                28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28,
                28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28
            ],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch2=dict(
            real_size=1120,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=768,
            num_heads=12,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.15,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch3=dict(
            real_size=1568,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=384,
            num_heads=6,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True))),
    neck=dict(
        type='FPN',
        in_channels=[1024, 1024, 1024, 1024],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='DeltaXYWHBBoxCoder',
            target_means=[0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(
            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=80,
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0.0, 0.0, 0.0, 0.0],
                target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(
                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead',
            num_convs=4,
            in_channels=256,
            conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(
                type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=-1,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            mask_size=28,
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=1000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            score_thr=0.05,
            nms=dict(type='nms', iou_threshold=0.5),
            max_per_img=100,
            mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(
    mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(
        type='Normalize',
        mean=[127.5, 127.5, 127.5],
        std=[127.5, 127.5, 127.5],
        to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1568, 941),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(
                type='Collect',
                keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(
        num_layers=24, layer_decay_rate=0.85, skip_stride=[2, 2]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.001,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(
    imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)
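[Editor's note, not part of the original log] The optimizer above delegates parameter grouping to CustomLayerDecayOptimizerConstructorMMDet with num_layers=24, layer_decay_rate=0.85 and skip_stride=[2, 2]. As a rough sketch only (the constructor's actual mapping of the three PIIP branches and skip_stride onto layer ids lives in mmdet.mmcv_custom and is not reproduced here), the generic layer-wise decay rule behind such constructors scales the base learning rate geometrically with block depth:

# Illustrative layer-wise lr decay using the values from paramwise_cfg above.
# NOT the exact CustomLayerDecayOptimizerConstructorMMDet logic.
num_layers = 24          # depth of branch1, per paramwise_cfg
layer_decay_rate = 0.85  # per paramwise_cfg
base_lr = 1e-4           # optimizer lr

def lr_scale(layer_id: int) -> float:
    """Geometric decay: earlier layers (smaller ids) get smaller lrs."""
    return layer_decay_rate ** (num_layers + 1 - layer_id)

for layer_id in (0, 12, 24, 25):
    print(f"layer {layer_id:2d}: lr = {base_lr * lr_scale(layer_id):.2e}")

Under this simplified rule the earliest layers train at roughly 1.7e-6 while the detection heads keep the full 1e-4.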
2024-05-28 09:20:39,823 - mmdet - INFO - Set random seed to 931828675, deterministic: False
2024-05-28 09:20:47,275 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:20:49,914 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:20:51,362 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:22:07,555 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-28 09:22:07,789 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-28 09:22:07,810 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
backbone.w1 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.w2 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.w3 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.pos_embed - torch.Size([1, 196, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.patch_embed.proj.weight - torch.Size([1024, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.patch_embed.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.proj.bias -
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.bias - 
torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.6.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.7.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.9.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.weight - torch.Size([1024]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.weight - torch.Size([1024, 1024]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.weight - torch.Size([3072, 
1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.17.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.bias - torch.Size([1024]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.bias - torch.Size([4096]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.pos_embed - torch.Size([1, 196, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.weight - torch.Size([768, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.0.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.2.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.4.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.5.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.7.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.11.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
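The per-parameter shapes recorded above follow the standard DeiT-style transformer block layout shared by the three PIIP branches, whose embedding widths are 1024 (branch1), 768 (branch2) and 384 (branch3): `attn.qkv.weight` is [3*dim, dim], `attn.proj.weight` is [dim, dim], `mlp.fc1.weight` is [4*dim, dim] (mlp_ratio 4), `gamma_1`/`gamma_2` are LayerScale vectors of length dim, and the logged `pos_embed` tensors hold (224/16)^2 = 196 patch tokens. The following is a minimal, hedged sketch that reproduces these sizes; it is an illustration of the generic block layout, not the repository's implementation:

```python
# Hedged sketch: reproduce the per-block parameter shapes reported above for the
# three PIIP branches. Assumes a standard DeiT-style block (qkv bias on,
# mlp_ratio 4, LayerScale); illustrative only, not the project's actual code.
def branch_shapes(dim, mlp_ratio=4, img_size=224, patch_size=16):
    tokens = (img_size // patch_size) ** 2          # 196 patch tokens, no cls token
    return {
        "pos_embed": (1, tokens, dim),              # matches the logged pos_embed sizes
        "patch_embed.proj.weight": (dim, 3, patch_size, patch_size),
        "gamma_1 / gamma_2": (dim,),                # LayerScale vectors
        "attn.qkv.weight": (3 * dim, dim),
        "attn.proj.weight": (dim, dim),
        "mlp.fc1.weight": (mlp_ratio * dim, dim),
        "mlp.fc2.weight": (dim, mlp_ratio * dim),
    }

for dim in (1024, 768, 384):                        # branch1, branch2, branch3
    print(dim, branch_shapes(dim))
```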
backbone.branch3.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
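The interaction-unit shapes logged here are likewise fixed by the branch widths together with the deformable-attention settings given in the config at the top of this log (16 heads, 4 sampling points, a single feature level, deform_ratio 0.5, cffn_ratio 0.25): the `branch2to1_proj`/`branch1to2_proj` layers are plain linear maps between two branch widths, `attn.sampling_offsets` outputs heads x levels x points x 2 = 128 values, `attn.attention_weights` outputs heads x levels x points = 64, `attn.value_proj` shrinks the width by deform_ratio, and the CFFN hidden size is cffn_ratio x dim with a depthwise 3x3 `dwconv` on that hidden width. A minimal, hedged sketch of this arithmetic (illustrative only, not the project's implementation):

```python
# Hedged sketch: how the injector parameter sizes in this log follow from the
# deformable-attention hyper-parameters (deform_num_heads=16, n_points=4,
# one feature level, deform_ratio=0.5, cffn_ratio=0.25). Illustrative only.
def injector_shapes(dim, heads=16, levels=1, points=4,
                    deform_ratio=0.5, cffn_ratio=0.25):
    value_dim = int(dim * deform_ratio)              # e.g. 512 for dim=1024
    hidden = int(dim * cffn_ratio)                   # e.g. 256 for dim=1024
    return {
        "attn.sampling_offsets.weight": (heads * levels * points * 2, dim),  # (128, dim)
        "attn.attention_weights.weight": (heads * levels * points, dim),     # (64, dim)
        "attn.value_proj.weight": (value_dim, dim),
        "attn.output_proj.weight": (dim, value_dim),
        "ffn.fc1.weight": (hidden, dim),
        "ffn.dwconv.dwconv.weight": (hidden, 1, 3, 3),   # depthwise 3x3 conv
        "ffn.fc2.weight": (dim, hidden),
    }

# The injector whose queries live at width 1024 feeds branch1, the one at 768
# feeds branch2, and the one at 384 feeds branch3; the branchXtoY_proj layers
# are Linear maps between the corresponding widths.
for dim in (1024, 768, 384):
    print(dim, injector_shapes(dim))
```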
backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.1.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 
384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight 
- torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight - 
torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN
The remaining parameters of backbone.interactions.7 are likewise unchanged before and after calling `init_weights` of MaskRCNN (name - shape):
  interaction_units_12.branch1to2_injector: attn.output_proj.bias [768], ffn.fc1.weight [192, 768], ffn.fc1.bias [192], ffn.dwconv.dwconv.weight [192, 1, 3, 3], ffn.dwconv.dwconv.bias [192], ffn.fc2.weight [768, 192], ffn.fc2.bias [768], ffn_norm.weight [768], ffn_norm.bias [768]
  interaction_units_23.branch2to1_proj: weight [768, 384], bias [768]
  interaction_units_23.branch2to1_injector: ca_gamma [768], cffn_gamma [768], query_norm.weight [768], query_norm.bias [768], feat_norm.weight [768], feat_norm.bias [768], attn.sampling_offsets.weight [128, 768], attn.sampling_offsets.bias [128], attn.attention_weights.weight [64, 768], attn.attention_weights.bias [64], attn.value_proj.weight [384, 768], attn.value_proj.bias [384], attn.output_proj.weight [768, 384], attn.output_proj.bias [768], ffn.fc1.weight [192, 768], ffn.fc1.bias [192], ffn.dwconv.dwconv.weight [192, 1, 3, 3], ffn.dwconv.dwconv.bias [192], ffn.fc2.weight [768, 192], ffn.fc2.bias [768], ffn_norm.weight [768], ffn_norm.bias [768]
  interaction_units_23.branch1to2_proj: weight [384, 768], bias [384]
  interaction_units_23.branch1to2_injector: ca_gamma [384], cffn_gamma [384], query_norm.weight [384], query_norm.bias [384], feat_norm.weight [384], feat_norm.bias [384], attn.sampling_offsets.weight [128, 384], attn.sampling_offsets.bias [128], attn.attention_weights.weight [64, 384], attn.attention_weights.bias [64], attn.value_proj.weight [192, 384], attn.value_proj.bias [192], attn.output_proj.weight [384, 192], attn.output_proj.bias [384], ffn.fc1.weight [96, 384], ffn.fc1.bias [96], ffn.dwconv.dwconv.weight [96, 1, 3, 3], ffn.dwconv.dwconv.bias [96], ffn.fc2.weight [384, 96], ffn.fc2.bias [384], ffn_norm.weight [384], ffn_norm.bias [384]
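Note: the recurring 128-, 64- and half-width attention projections above follow a regular pattern. The shape-level sketch below (an inferred reconstruction from the logged sizes, not the project's actual module) reproduces them for the 768-dim branch, assuming an MSDeformAttn-style layer with 16 heads, 4 sampling points, one feature level, and a 0.5 value-projection ratio:

import torch
import torch.nn as nn

class DeformAttnShapes(nn.Module):
    # Shape-level sketch of the injector's deformable cross-attention.
    # Only the projection layers and their sizes are taken from the log above.
    def __init__(self, dim=768, num_heads=16, num_points=4, num_levels=1, value_ratio=0.5):
        super().__init__()
        value_dim = int(dim * value_ratio)                 # 768 * 0.5 = 384
        # two offsets (x, y) per head, point and level -> 16 * 1 * 4 * 2 = 128
        self.sampling_offsets = nn.Linear(dim, num_heads * num_levels * num_points * 2)
        # one scalar weight per head, point and level -> 16 * 1 * 4 = 64
        self.attention_weights = nn.Linear(dim, num_heads * num_levels * num_points)
        self.value_proj = nn.Linear(dim, value_dim)
        self.output_proj = nn.Linear(value_dim, dim)

m = DeformAttnShapes()
print(m.sampling_offsets.weight.shape)   # torch.Size([128, 768])
print(m.attention_weights.weight.shape)  # torch.Size([64, 768])
print(m.value_proj.weight.shape)         # torch.Size([384, 768])
print(m.output_proj.weight.shape)        # torch.Size([768, 384])

The same constructor with dim=1024 or dim=384 reproduces the [128, 1024]/[512, 1024] and [128, 384]/[192, 384] entries, respectively.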
All parameters of backbone.interactions.8 are unchanged before and after calling `init_weights` of MaskRCNN (name - shape):
  interaction_units_12.branch2to1_proj: weight [1024, 768], bias [1024]
  interaction_units_12.branch2to1_injector: ca_gamma [1024], cffn_gamma [1024], query_norm.weight [1024], query_norm.bias [1024], feat_norm.weight [1024], feat_norm.bias [1024], attn.sampling_offsets.weight [128, 1024], attn.sampling_offsets.bias [128], attn.attention_weights.weight [64, 1024], attn.attention_weights.bias [64], attn.value_proj.weight [512, 1024], attn.value_proj.bias [512], attn.output_proj.weight [1024, 512], attn.output_proj.bias [1024], ffn.fc1.weight [256, 1024], ffn.fc1.bias [256], ffn.dwconv.dwconv.weight [256, 1, 3, 3], ffn.dwconv.dwconv.bias [256], ffn.fc2.weight [1024, 256], ffn.fc2.bias [1024], ffn_norm.weight [1024], ffn_norm.bias [1024]
  interaction_units_12.branch1to2_proj: weight [768, 1024], bias [768]
  interaction_units_12.branch1to2_injector: ca_gamma [768], cffn_gamma [768], query_norm.weight [768], query_norm.bias [768], feat_norm.weight [768], feat_norm.bias [768], attn.sampling_offsets.weight [128, 768], attn.sampling_offsets.bias [128], attn.attention_weights.weight [64, 768], attn.attention_weights.bias [64], attn.value_proj.weight [384, 768], attn.value_proj.bias [384], attn.output_proj.weight [768, 384], attn.output_proj.bias [768], ffn.fc1.weight [192, 768], ffn.fc1.bias [192], ffn.dwconv.dwconv.weight [192, 1, 3, 3], ffn.dwconv.dwconv.bias [192], ffn.fc2.weight [768, 192], ffn.fc2.bias [768], ffn_norm.weight [768], ffn_norm.bias [768]
  interaction_units_23: parameter names and shapes identical to backbone.interactions.7.interaction_units_23 listed above
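The message repeated throughout this section comes from mmcv's init-info tracking: these interaction parameters keep the values they were given at construction (or loaded from the pretrained branch checkpoints) and are not re-initialized by `init_weights`. A minimal stand-alone illustration of such a before/after check (a simplified sketch, not mmcv's actual implementation):

import torch

def report_unchanged_params(model):
    # Snapshot every parameter, call init_weights(), and report which values
    # were left untouched. Assumes an mmcv-style module exposing init_weights().
    before = {name: p.detach().clone() for name, p in model.named_parameters()}
    model.init_weights()
    for name, p in model.named_parameters():
        if torch.equal(before[name], p.detach()):
            print(f"{name} - {tuple(p.shape)}: "
                  f"the value is the same before and after calling init_weights")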
backbone.interactions.9: interaction_units_12 and interaction_units_23 have parameter names and shapes identical to backbone.interactions.8 listed above; every value is the same before and after calling `init_weights` of MaskRCNN.
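Besides the attention projections, each injector carries two learnable gates (ca_gamma, cffn_gamma), three LayerNorms, and a convolutional FFN whose hidden width is a quarter of the branch dimension (fc1 -> 3x3 depthwise conv -> fc2). A sketch of that parameter layout for the 384-dim case (an inferred reconstruction; the attention projections are as in the earlier sketch and are omitted here):

import torch
import torch.nn as nn

class DWConv(nn.Module):
    # Depthwise 3x3 conv, nested so parameter paths read ffn.dwconv.dwconv.*
    def __init__(self, channels):
        super().__init__()
        self.dwconv = nn.Conv2d(channels, channels, 3, padding=1, groups=channels)

class ConvFFN(nn.Module):
    # fc1 -> depthwise conv -> fc2, hidden = dim * 0.25 (the cffn_ratio)
    def __init__(self, dim, cffn_ratio=0.25):
        super().__init__()
        hidden = int(dim * cffn_ratio)
        self.fc1 = nn.Linear(dim, hidden)
        self.dwconv = DWConv(hidden)
        self.fc2 = nn.Linear(hidden, dim)

class InjectorParams(nn.Module):
    # Non-attention parameters of one injector, matching the names in the log.
    def __init__(self, dim):
        super().__init__()
        self.ca_gamma = nn.Parameter(torch.zeros(dim))    # assumed to gate the attention branch
        self.cffn_gamma = nn.Parameter(torch.zeros(dim))  # assumed to gate the FFN branch
        self.query_norm = nn.LayerNorm(dim)
        self.feat_norm = nn.LayerNorm(dim)
        self.ffn = ConvFFN(dim)
        self.ffn_norm = nn.LayerNorm(dim)

for name, p in InjectorParams(384).named_parameters():
    print(name, tuple(p.shape))   # e.g. ffn.dwconv.dwconv.weight (96, 1, 3, 3)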
backbone.interactions.10: interaction_units_12 and interaction_units_23 have parameter names and shapes identical to backbone.interactions.8 listed above; every value is the same before and after calling `init_weights` of MaskRCNN.
backbone.interactions.11.interaction_units_12: parameter names and shapes identical to backbone.interactions.8.interaction_units_12 listed above; every value is the same before and after calling `init_weights` of MaskRCNN.
backbone.interactions.11.interaction_units_23.branch2to1_proj: weight [768, 384], bias [768] - unchanged before and after calling `init_weights` of MaskRCNN.
backbone.interactions.11.interaction_units_23.branch2to1_injector: ca_gamma [768], cffn_gamma [768], query_norm.weight [768], query_norm.bias [768], feat_norm.weight [768], feat_norm.bias [768], attn.sampling_offsets.weight [128, 768], attn.sampling_offsets.bias [128], attn.attention_weights.weight [64, 768], attn.attention_weights.bias [64], attn.value_proj.weight [384, 768], attn.value_proj.bias [384], attn.output_proj.weight [768, 384], attn.output_proj.bias [768], ffn.fc1.weight [192, 768] - all unchanged before and after calling `init_weights` of MaskRCNN.
backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before
and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.0.weight - torch.Size([1024, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.merge_branch3.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.0.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 
neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead 2024-05-28 09:22:22,419 - mmdet - INFO - {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2]} 2024-05-28 09:22:22,419 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 26 2024-05-28 09:22:22,432 - mmdet - INFO - Param groups = { "layer_25_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", "neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", "backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.05 }, "layer_0_no_decay": { 
"param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", "backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", "backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias" ], "lr_scale": 
0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", "backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", 
"backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", "backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", "backbone.branch3.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", 
"backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", "backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", "backbone.branch3.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", 
"backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.branch1.blocks.12.gamma_1", "backbone.branch1.blocks.12.gamma_2", "backbone.branch1.blocks.12.norm1.weight", "backbone.branch1.blocks.12.norm1.bias", "backbone.branch1.blocks.12.attn.qkv.bias", "backbone.branch1.blocks.12.attn.proj.bias", 
"backbone.branch1.blocks.12.norm2.weight", "backbone.branch1.blocks.12.norm2.bias", "backbone.branch1.blocks.12.mlp.fc1.bias", "backbone.branch1.blocks.12.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_13_decay": { "param_names": [ "backbone.branch1.blocks.12.attn.qkv.weight", "backbone.branch1.blocks.12.attn.proj.weight", "backbone.branch1.blocks.12.mlp.fc1.weight", "backbone.branch1.blocks.12.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_14_no_decay": { "param_names": [ "backbone.branch1.blocks.13.gamma_1", "backbone.branch1.blocks.13.gamma_2", "backbone.branch1.blocks.13.norm1.weight", "backbone.branch1.blocks.13.norm1.bias", "backbone.branch1.blocks.13.attn.qkv.bias", "backbone.branch1.blocks.13.attn.proj.bias", "backbone.branch1.blocks.13.norm2.weight", "backbone.branch1.blocks.13.norm2.bias", "backbone.branch1.blocks.13.mlp.fc1.bias", "backbone.branch1.blocks.13.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", "backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_14_decay": { "param_names": [ "backbone.branch1.blocks.13.attn.qkv.weight", "backbone.branch1.blocks.13.attn.proj.weight", "backbone.branch1.blocks.13.mlp.fc1.weight", "backbone.branch1.blocks.13.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_15_no_decay": { "param_names": [ "backbone.branch1.blocks.14.gamma_1", "backbone.branch1.blocks.14.gamma_2", "backbone.branch1.blocks.14.norm1.weight", "backbone.branch1.blocks.14.norm1.bias", "backbone.branch1.blocks.14.attn.qkv.bias", "backbone.branch1.blocks.14.attn.proj.bias", "backbone.branch1.blocks.14.norm2.weight", "backbone.branch1.blocks.14.norm2.bias", "backbone.branch1.blocks.14.mlp.fc1.bias", "backbone.branch1.blocks.14.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_15_decay": { "param_names": [ "backbone.branch1.blocks.14.attn.qkv.weight", "backbone.branch1.blocks.14.attn.proj.weight", "backbone.branch1.blocks.14.mlp.fc1.weight", "backbone.branch1.blocks.14.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_16_no_decay": { "param_names": [ "backbone.branch1.blocks.15.gamma_1", 
"backbone.branch1.blocks.15.gamma_2", "backbone.branch1.blocks.15.norm1.weight", "backbone.branch1.blocks.15.norm1.bias", "backbone.branch1.blocks.15.attn.qkv.bias", "backbone.branch1.blocks.15.attn.proj.bias", "backbone.branch1.blocks.15.norm2.weight", "backbone.branch1.blocks.15.norm2.bias", "backbone.branch1.blocks.15.mlp.fc1.bias", "backbone.branch1.blocks.15.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_16_decay": { "param_names": [ "backbone.branch1.blocks.15.attn.qkv.weight", "backbone.branch1.blocks.15.attn.proj.weight", "backbone.branch1.blocks.15.mlp.fc1.weight", "backbone.branch1.blocks.15.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_17_no_decay": { "param_names": [ "backbone.branch1.blocks.16.gamma_1", "backbone.branch1.blocks.16.gamma_2", "backbone.branch1.blocks.16.norm1.weight", "backbone.branch1.blocks.16.norm1.bias", "backbone.branch1.blocks.16.attn.qkv.bias", "backbone.branch1.blocks.16.attn.proj.bias", "backbone.branch1.blocks.16.norm2.weight", "backbone.branch1.blocks.16.norm2.bias", "backbone.branch1.blocks.16.mlp.fc1.bias", "backbone.branch1.blocks.16.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_17_decay": { "param_names": [ "backbone.branch1.blocks.16.attn.qkv.weight", "backbone.branch1.blocks.16.attn.proj.weight", "backbone.branch1.blocks.16.mlp.fc1.weight", "backbone.branch1.blocks.16.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_18_no_decay": { "param_names": [ "backbone.branch1.blocks.17.gamma_1", "backbone.branch1.blocks.17.gamma_2", "backbone.branch1.blocks.17.norm1.weight", "backbone.branch1.blocks.17.norm1.bias", "backbone.branch1.blocks.17.attn.qkv.bias", "backbone.branch1.blocks.17.attn.proj.bias", "backbone.branch1.blocks.17.norm2.weight", "backbone.branch1.blocks.17.norm2.bias", "backbone.branch1.blocks.17.mlp.fc1.bias", "backbone.branch1.blocks.17.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", 
"backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_18_decay": { "param_names": [ "backbone.branch1.blocks.17.attn.qkv.weight", "backbone.branch1.blocks.17.attn.proj.weight", "backbone.branch1.blocks.17.mlp.fc1.weight", "backbone.branch1.blocks.17.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_19_no_decay": { "param_names": [ "backbone.branch1.blocks.18.gamma_1", "backbone.branch1.blocks.18.gamma_2", "backbone.branch1.blocks.18.norm1.weight", "backbone.branch1.blocks.18.norm1.bias", "backbone.branch1.blocks.18.attn.qkv.bias", "backbone.branch1.blocks.18.attn.proj.bias", "backbone.branch1.blocks.18.norm2.weight", "backbone.branch1.blocks.18.norm2.bias", "backbone.branch1.blocks.18.mlp.fc1.bias", "backbone.branch1.blocks.18.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_19_decay": { "param_names": [ "backbone.branch1.blocks.18.attn.qkv.weight", "backbone.branch1.blocks.18.attn.proj.weight", "backbone.branch1.blocks.18.mlp.fc1.weight", "backbone.branch1.blocks.18.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_20_no_decay": { "param_names": [ "backbone.branch1.blocks.19.gamma_1", "backbone.branch1.blocks.19.gamma_2", "backbone.branch1.blocks.19.norm1.weight", "backbone.branch1.blocks.19.norm1.bias", "backbone.branch1.blocks.19.attn.qkv.bias", "backbone.branch1.blocks.19.attn.proj.bias", "backbone.branch1.blocks.19.norm2.weight", "backbone.branch1.blocks.19.norm2.bias", "backbone.branch1.blocks.19.mlp.fc1.bias", "backbone.branch1.blocks.19.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_20_decay": { "param_names": [ 
"backbone.branch1.blocks.19.attn.qkv.weight", "backbone.branch1.blocks.19.attn.proj.weight", "backbone.branch1.blocks.19.mlp.fc1.weight", "backbone.branch1.blocks.19.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", "backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_21_no_decay": { "param_names": [ "backbone.branch1.blocks.20.gamma_1", "backbone.branch1.blocks.20.gamma_2", "backbone.branch1.blocks.20.norm1.weight", "backbone.branch1.blocks.20.norm1.bias", "backbone.branch1.blocks.20.attn.qkv.bias", "backbone.branch1.blocks.20.attn.proj.bias", "backbone.branch1.blocks.20.norm2.weight", "backbone.branch1.blocks.20.norm2.bias", "backbone.branch1.blocks.20.mlp.fc1.bias", "backbone.branch1.blocks.20.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_21_decay": { "param_names": [ "backbone.branch1.blocks.20.attn.qkv.weight", "backbone.branch1.blocks.20.attn.proj.weight", "backbone.branch1.blocks.20.mlp.fc1.weight", "backbone.branch1.blocks.20.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_22_no_decay": { "param_names": [ "backbone.branch1.blocks.21.gamma_1", "backbone.branch1.blocks.21.gamma_2", "backbone.branch1.blocks.21.norm1.weight", "backbone.branch1.blocks.21.norm1.bias", "backbone.branch1.blocks.21.attn.qkv.bias", "backbone.branch1.blocks.21.attn.proj.bias", "backbone.branch1.blocks.21.norm2.weight", "backbone.branch1.blocks.21.norm2.bias", "backbone.branch1.blocks.21.mlp.fc1.bias", "backbone.branch1.blocks.21.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_22_decay": { "param_names": [ "backbone.branch1.blocks.21.attn.qkv.weight", "backbone.branch1.blocks.21.attn.proj.weight", "backbone.branch1.blocks.21.mlp.fc1.weight", "backbone.branch1.blocks.21.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_23_no_decay": { 
"param_names": [ "backbone.branch1.blocks.22.gamma_1", "backbone.branch1.blocks.22.gamma_2", "backbone.branch1.blocks.22.norm1.weight", "backbone.branch1.blocks.22.norm1.bias", "backbone.branch1.blocks.22.attn.qkv.bias", "backbone.branch1.blocks.22.attn.proj.bias", "backbone.branch1.blocks.22.norm2.weight", "backbone.branch1.blocks.22.norm2.bias", "backbone.branch1.blocks.22.mlp.fc1.bias", "backbone.branch1.blocks.22.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_23_decay": { "param_names": [ "backbone.branch1.blocks.22.attn.qkv.weight", "backbone.branch1.blocks.22.attn.proj.weight", "backbone.branch1.blocks.22.mlp.fc1.weight", "backbone.branch1.blocks.22.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_24_no_decay": { "param_names": [ "backbone.branch1.blocks.23.gamma_1", "backbone.branch1.blocks.23.gamma_2", "backbone.branch1.blocks.23.norm1.weight", "backbone.branch1.blocks.23.norm1.bias", "backbone.branch1.blocks.23.attn.qkv.bias", "backbone.branch1.blocks.23.attn.proj.bias", "backbone.branch1.blocks.23.norm2.weight", "backbone.branch1.blocks.23.norm2.bias", "backbone.branch1.blocks.23.mlp.fc1.bias", "backbone.branch1.blocks.23.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", "backbone.branch3.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_24_decay": { "param_names": [ "backbone.branch1.blocks.23.attn.qkv.weight", "backbone.branch1.blocks.23.attn.proj.weight", "backbone.branch1.blocks.23.mlp.fc1.weight", "backbone.branch1.blocks.23.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_25_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", 
"backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", 
"backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", 
"backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", 
"backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", 
"backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", 
"backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", 
"backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", 
"backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", 
"backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", "roi_head.bbox_head.shared_fcs.0.bias", "roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-28 09:22:53,724 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 
2024-05-28 09:22:54,129 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16
2024-05-28 09:22:54,129 - mmdet - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH ) StepLrUpdaterHook
(49 ) ToBFloat16HookMMDet
(NORMAL ) DeepspeedCheckpointHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_train_epoch:
(VERY_HIGH ) StepLrUpdaterHook
(NORMAL ) DistSamplerSeedHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_train_iter:
(VERY_HIGH ) StepLrUpdaterHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
 --------------------
after_train_iter:
(ABOVE_NORMAL) OptimizerHook
(NORMAL ) DeepspeedCheckpointHook
(LOW ) IterTimerHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
after_train_epoch:
(NORMAL ) DeepspeedCheckpointHook
(LOW ) DeepspeedDistEvalHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_val_epoch:
(NORMAL ) DistSamplerSeedHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
 --------------------
before_val_iter:
(LOW ) IterTimerHook
 --------------------
after_val_iter:
(LOW ) IterTimerHook
 --------------------
after_val_epoch:
(VERY_LOW ) TextLoggerHook
 --------------------
after_run:
(VERY_LOW ) TextLoggerHook
 --------------------
2024-05-28 09:22:54,129 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs
2024-05-28 09:22:54,140 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16 by HardDiskBackend.
2024-05-28 09:23:47,517 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 1 day, 2:03:48, time: 1.067, data_time: 0.131, memory: 20573, loss_rpn_cls: 0.5992, loss_rpn_bbox: 0.1291, loss_cls: 1.8390, acc: 65.4624, loss_bbox: 0.0130, loss_mask: 1.2414, loss: 3.8218
2024-05-28 09:24:35,803 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 1 day, 0:48:32, time: 0.966, data_time: 0.121, memory: 20731, loss_rpn_cls: 0.3012, loss_rpn_bbox: 0.1039, loss_cls: 0.3336, acc: 96.3633, loss_bbox: 0.1010, loss_mask: 0.7530, loss: 1.5929
2024-05-28 09:25:20,435 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 23:47:14, time: 0.893, data_time: 0.044, memory: 20731, loss_rpn_cls: 0.2453, loss_rpn_bbox: 0.1025, loss_cls: 0.3517, acc: 94.9824, loss_bbox: 0.1624, loss_mask: 0.6959, loss: 1.5578
2024-05-28 09:26:05,169 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 23:16:57, time: 0.895, data_time: 0.049, memory: 20752, loss_rpn_cls: 0.2246, loss_rpn_bbox: 0.0982, loss_cls: 0.3055, acc: 95.3662, loss_bbox: 0.1492, loss_mask: 0.6840, loss: 1.4615
2024-05-28 09:26:56,708 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 23:38:18, time: 1.031, data_time: 0.048, memory: 20752, loss_rpn_cls: 0.1994, loss_rpn_bbox: 0.0985, loss_cls: 0.3549, acc: 94.3813, loss_bbox: 0.1930, loss_mask: 0.6670, loss: 1.5128
2024-05-28 09:27:42,214 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 23:22:51, time: 0.910, data_time: 0.044, memory: 20814, loss_rpn_cls: 0.1612, loss_rpn_bbox: 0.1006, loss_cls: 0.4112, acc: 93.3218, loss_bbox: 0.2339, loss_mask: 0.6462, loss: 1.5530
2024-05-28 09:28:26,689 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 23:07:17, time: 0.889, data_time: 0.061, memory: 20894, loss_rpn_cls: 0.1413, loss_rpn_bbox: 0.0980, loss_cls: 0.4293, acc: 92.9563, loss_bbox: 0.2511, loss_mask: 0.6230, loss: 1.5426
2024-05-28 09:29:11,525 - mmdet -
INFO - Epoch [1][400/7330] lr: 7.982e-05, eta: 22:56:46, time: 0.897, data_time: 0.054, memory: 20894, loss_rpn_cls: 0.1150, loss_rpn_bbox: 0.0941, loss_cls: 0.4405, acc: 92.2112, loss_bbox: 0.2827, loss_mask: 0.5940, loss: 1.5263 2024-05-28 09:29:56,909 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 22:50:11, time: 0.908, data_time: 0.042, memory: 20894, loss_rpn_cls: 0.1111, loss_rpn_bbox: 0.0976, loss_cls: 0.4597, acc: 91.8220, loss_bbox: 0.2972, loss_mask: 0.5639, loss: 1.5295 2024-05-28 09:30:41,799 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 22:43:20, time: 0.898, data_time: 0.047, memory: 20895, loss_rpn_cls: 0.1050, loss_rpn_bbox: 0.0961, loss_cls: 0.4383, acc: 91.5273, loss_bbox: 0.3017, loss_mask: 0.5369, loss: 1.4779 2024-05-28 09:31:27,638 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 22:40:06, time: 0.917, data_time: 0.048, memory: 20895, loss_rpn_cls: 0.0988, loss_rpn_bbox: 0.0963, loss_cls: 0.4431, acc: 90.8647, loss_bbox: 0.3275, loss_mask: 0.5192, loss: 1.4850 2024-05-28 09:32:12,125 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 22:33:59, time: 0.890, data_time: 0.047, memory: 20895, loss_rpn_cls: 0.0928, loss_rpn_bbox: 0.0912, loss_cls: 0.3961, acc: 91.5388, loss_bbox: 0.2978, loss_mask: 0.4976, loss: 1.3754 2024-05-28 09:32:57,286 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 22:30:14, time: 0.903, data_time: 0.053, memory: 20895, loss_rpn_cls: 0.0875, loss_rpn_bbox: 0.0921, loss_cls: 0.4149, acc: 90.6887, loss_bbox: 0.3301, loss_mask: 0.4941, loss: 1.4186 2024-05-28 09:33:44,383 - mmdet - INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 22:30:55, time: 0.942, data_time: 0.046, memory: 20895, loss_rpn_cls: 0.0833, loss_rpn_bbox: 0.0872, loss_cls: 0.3902, acc: 91.1453, loss_bbox: 0.3131, loss_mask: 0.4724, loss: 1.3463 2024-05-28 09:34:29,674 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 22:27:54, time: 0.906, data_time: 0.057, memory: 20895, loss_rpn_cls: 0.0825, loss_rpn_bbox: 0.0904, loss_cls: 0.3974, acc: 90.3809, loss_bbox: 0.3386, loss_mask: 0.4657, loss: 1.3746 2024-05-28 09:35:14,928 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 22:25:06, time: 0.905, data_time: 0.043, memory: 20895, loss_rpn_cls: 0.0775, loss_rpn_bbox: 0.0839, loss_cls: 0.3821, acc: 90.6724, loss_bbox: 0.3315, loss_mask: 0.4524, loss: 1.3274 2024-05-28 09:35:59,349 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 22:21:08, time: 0.888, data_time: 0.040, memory: 20895, loss_rpn_cls: 0.0684, loss_rpn_bbox: 0.0793, loss_cls: 0.3610, acc: 90.7666, loss_bbox: 0.3311, loss_mask: 0.4441, loss: 1.2839 2024-05-28 09:36:44,791 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 22:19:09, time: 0.909, data_time: 0.054, memory: 20895, loss_rpn_cls: 0.0772, loss_rpn_bbox: 0.0898, loss_cls: 0.3717, acc: 90.2620, loss_bbox: 0.3440, loss_mask: 0.4507, loss: 1.3334 2024-05-28 09:37:29,736 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 22:16:33, time: 0.899, data_time: 0.039, memory: 20895, loss_rpn_cls: 0.0712, loss_rpn_bbox: 0.0855, loss_cls: 0.3575, acc: 90.5713, loss_bbox: 0.3299, loss_mask: 0.4404, loss: 1.2844 2024-05-28 09:38:14,364 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 09:38:14,364 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 22:13:40, time: 0.893, data_time: 0.056, memory: 20912, loss_rpn_cls: 0.0683, loss_rpn_bbox: 0.0852, loss_cls: 0.3644, acc: 90.1389, loss_bbox: 0.3547, loss_mask: 0.4302, loss: 1.3028 2024-05-28 09:38:59,585 - mmdet - INFO 
- Epoch [1][1050/7330] lr: 1.000e-04, eta: 22:11:49, time: 0.904, data_time: 0.039, memory: 20914, loss_rpn_cls: 0.0716, loss_rpn_bbox: 0.0843, loss_cls: 0.3463, acc: 90.6318, loss_bbox: 0.3350, loss_mask: 0.4210, loss: 1.2582 2024-05-28 09:39:45,207 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 22:10:35, time: 0.912, data_time: 0.063, memory: 20914, loss_rpn_cls: 0.0710, loss_rpn_bbox: 0.0844, loss_cls: 0.3475, acc: 90.3494, loss_bbox: 0.3511, loss_mask: 0.4168, loss: 1.2708 2024-05-28 09:40:38,604 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 22:15:58, time: 1.017, data_time: 0.043, memory: 20914, loss_rpn_cls: 0.0704, loss_rpn_bbox: 0.0859, loss_cls: 0.3570, acc: 90.0598, loss_bbox: 0.3559, loss_mask: 0.4223, loss: 1.2915 2024-05-28 09:41:27,804 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 22:21:56, time: 1.035, data_time: 0.097, memory: 20914, loss_rpn_cls: 0.0649, loss_rpn_bbox: 0.0855, loss_cls: 0.3545, acc: 89.9592, loss_bbox: 0.3590, loss_mask: 0.4128, loss: 1.2767 2024-05-28 09:42:13,160 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 22:19:57, time: 0.907, data_time: 0.043, memory: 20914, loss_rpn_cls: 0.0671, loss_rpn_bbox: 0.0813, loss_cls: 0.3507, acc: 90.1514, loss_bbox: 0.3543, loss_mask: 0.4140, loss: 1.2674 2024-05-28 09:42:58,138 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 22:17:38, time: 0.900, data_time: 0.060, memory: 20914, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0827, loss_cls: 0.3485, acc: 89.8987, loss_bbox: 0.3651, loss_mask: 0.3999, loss: 1.2544 2024-05-28 09:43:43,274 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 22:15:37, time: 0.903, data_time: 0.047, memory: 20914, loss_rpn_cls: 0.0651, loss_rpn_bbox: 0.0821, loss_cls: 0.3568, acc: 89.8086, loss_bbox: 0.3613, loss_mask: 0.4065, loss: 1.2718 2024-05-28 09:44:28,705 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 22:13:59, time: 0.909, data_time: 0.057, memory: 20914, loss_rpn_cls: 0.0598, loss_rpn_bbox: 0.0843, loss_cls: 0.3388, acc: 90.1064, loss_bbox: 0.3512, loss_mask: 0.4056, loss: 1.2397 2024-05-28 09:45:13,897 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 22:12:11, time: 0.904, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0756, loss_cls: 0.3352, acc: 90.2632, loss_bbox: 0.3471, loss_mask: 0.3891, loss: 1.2055 2024-05-28 09:45:59,592 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 22:10:55, time: 0.914, data_time: 0.052, memory: 20914, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0795, loss_cls: 0.3434, acc: 89.8953, loss_bbox: 0.3594, loss_mask: 0.3908, loss: 1.2324 2024-05-28 09:46:44,711 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 22:09:10, time: 0.902, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0799, loss_cls: 0.3501, acc: 89.9412, loss_bbox: 0.3579, loss_mask: 0.3984, loss: 1.2446 2024-05-28 09:47:32,208 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 22:09:37, time: 0.950, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0560, loss_rpn_bbox: 0.0775, loss_cls: 0.3383, acc: 90.0591, loss_bbox: 0.3520, loss_mask: 0.3851, loss: 1.2089 2024-05-28 09:48:17,091 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 22:07:42, time: 0.898, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0605, loss_rpn_bbox: 0.0798, loss_cls: 0.3331, acc: 90.1726, loss_bbox: 0.3507, loss_mask: 0.3911, loss: 1.2152 2024-05-28 09:49:02,219 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 22:06:04, time: 0.903, data_time: 0.053, memory: 
20914, loss_rpn_cls: 0.0602, loss_rpn_bbox: 0.0812, loss_cls: 0.3341, acc: 90.0574, loss_bbox: 0.3528, loss_mask: 0.3893, loss: 1.2177 2024-05-28 09:49:47,377 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 22:04:31, time: 0.903, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0544, loss_rpn_bbox: 0.0761, loss_cls: 0.3226, acc: 90.3970, loss_bbox: 0.3436, loss_mask: 0.3787, loss: 1.1754 2024-05-28 09:50:32,641 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 22:03:05, time: 0.905, data_time: 0.053, memory: 20914, loss_rpn_cls: 0.0607, loss_rpn_bbox: 0.0782, loss_cls: 0.3229, acc: 90.3372, loss_bbox: 0.3402, loss_mask: 0.3808, loss: 1.1828 2024-05-28 09:51:18,329 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 22:02:01, time: 0.914, data_time: 0.063, memory: 20914, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0796, loss_cls: 0.3488, acc: 89.6145, loss_bbox: 0.3678, loss_mask: 0.3845, loss: 1.2387 2024-05-28 09:52:04,487 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 22:01:20, time: 0.923, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0570, loss_rpn_bbox: 0.0811, loss_cls: 0.3272, acc: 89.8979, loss_bbox: 0.3619, loss_mask: 0.3860, loss: 1.2132 2024-05-28 09:52:50,414 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 22:00:28, time: 0.919, data_time: 0.055, memory: 20914, loss_rpn_cls: 0.0546, loss_rpn_bbox: 0.0778, loss_cls: 0.3231, acc: 90.3357, loss_bbox: 0.3438, loss_mask: 0.3667, loss: 1.1660 2024-05-28 09:53:35,654 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 09:53:35,654 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 21:59:07, time: 0.905, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0779, loss_cls: 0.3189, acc: 90.1790, loss_bbox: 0.3499, loss_mask: 0.3722, loss: 1.1770 2024-05-28 09:54:22,606 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 21:58:59, time: 0.939, data_time: 0.059, memory: 20914, loss_rpn_cls: 0.0549, loss_rpn_bbox: 0.0812, loss_cls: 0.3275, acc: 89.9336, loss_bbox: 0.3584, loss_mask: 0.3683, loss: 1.1904 2024-05-28 09:55:19,013 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 22:05:16, time: 1.128, data_time: 0.043, memory: 20914, loss_rpn_cls: 0.0545, loss_rpn_bbox: 0.0779, loss_cls: 0.3151, acc: 90.4587, loss_bbox: 0.3425, loss_mask: 0.3637, loss: 1.1537 2024-05-28 09:56:03,801 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 22:03:29, time: 0.896, data_time: 0.041, memory: 20914, loss_rpn_cls: 0.0501, loss_rpn_bbox: 0.0768, loss_cls: 0.2974, acc: 90.8894, loss_bbox: 0.3299, loss_mask: 0.3566, loss: 1.1109 2024-05-28 09:56:49,361 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 22:02:15, time: 0.911, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0555, loss_rpn_bbox: 0.0800, loss_cls: 0.3260, acc: 89.9031, loss_bbox: 0.3581, loss_mask: 0.3712, loss: 1.1908 2024-05-28 09:57:35,348 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 22:01:18, time: 0.920, data_time: 0.066, memory: 20914, loss_rpn_cls: 0.0525, loss_rpn_bbox: 0.0771, loss_cls: 0.3171, acc: 90.4226, loss_bbox: 0.3450, loss_mask: 0.3593, loss: 1.1511 2024-05-28 09:58:20,755 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 22:00:01, time: 0.908, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0553, loss_rpn_bbox: 0.0751, loss_cls: 0.3154, acc: 90.3081, loss_bbox: 0.3426, loss_mask: 0.3590, loss: 1.1474 2024-05-28 09:59:05,806 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 21:58:32, time: 0.901, data_time: 0.041, 
memory: 20914, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0737, loss_cls: 0.3125, acc: 90.4771, loss_bbox: 0.3355, loss_mask: 0.3547, loss: 1.1260 2024-05-28 09:59:51,094 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 21:57:13, time: 0.906, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0518, loss_rpn_bbox: 0.0777, loss_cls: 0.3337, acc: 89.6650, loss_bbox: 0.3639, loss_mask: 0.3599, loss: 1.1870 2024-05-28 10:00:36,144 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 21:55:47, time: 0.901, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0564, loss_rpn_bbox: 0.0776, loss_cls: 0.3175, acc: 90.1782, loss_bbox: 0.3503, loss_mask: 0.3604, loss: 1.1623 2024-05-28 10:01:23,065 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 21:55:27, time: 0.938, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0484, loss_rpn_bbox: 0.0690, loss_cls: 0.3036, acc: 90.5188, loss_bbox: 0.3380, loss_mask: 0.3584, loss: 1.1175 2024-05-28 10:02:07,992 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 21:53:59, time: 0.899, data_time: 0.052, memory: 20914, loss_rpn_cls: 0.0498, loss_rpn_bbox: 0.0758, loss_cls: 0.3003, acc: 90.6165, loss_bbox: 0.3318, loss_mask: 0.3533, loss: 1.1109 2024-05-28 10:02:53,018 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 21:52:35, time: 0.900, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0485, loss_rpn_bbox: 0.0729, loss_cls: 0.3082, acc: 90.2839, loss_bbox: 0.3477, loss_mask: 0.3516, loss: 1.1289 2024-05-28 10:03:37,827 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 21:51:07, time: 0.896, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0564, loss_rpn_bbox: 0.0753, loss_cls: 0.3079, acc: 90.3289, loss_bbox: 0.3389, loss_mask: 0.3573, loss: 1.1359 2024-05-28 10:04:22,692 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 21:49:41, time: 0.897, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0537, loss_rpn_bbox: 0.0783, loss_cls: 0.3067, acc: 90.3259, loss_bbox: 0.3392, loss_mask: 0.3546, loss: 1.1324 2024-05-28 10:05:07,456 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 21:48:14, time: 0.895, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0428, loss_rpn_bbox: 0.0685, loss_cls: 0.2949, acc: 90.8992, loss_bbox: 0.3258, loss_mask: 0.3400, loss: 1.0720 2024-05-28 10:05:52,611 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 21:47:01, time: 0.903, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0471, loss_rpn_bbox: 0.0739, loss_cls: 0.3062, acc: 90.3147, loss_bbox: 0.3463, loss_mask: 0.3454, loss: 1.1189 2024-05-28 10:06:39,019 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 21:46:26, time: 0.928, data_time: 0.069, memory: 20914, loss_rpn_cls: 0.0486, loss_rpn_bbox: 0.0754, loss_cls: 0.3045, acc: 90.2744, loss_bbox: 0.3420, loss_mask: 0.3445, loss: 1.1151 2024-05-28 10:07:23,880 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 21:45:05, time: 0.897, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0459, loss_rpn_bbox: 0.0703, loss_cls: 0.2876, acc: 90.8796, loss_bbox: 0.3252, loss_mask: 0.3374, loss: 1.0665 2024-05-28 10:08:09,391 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 21:44:04, time: 0.910, data_time: 0.062, memory: 20914, loss_rpn_cls: 0.0487, loss_rpn_bbox: 0.0731, loss_cls: 0.3117, acc: 90.0603, loss_bbox: 0.3493, loss_mask: 0.3442, loss: 1.1269 2024-05-28 10:09:06,213 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 10:09:06,213 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 21:48:24, time: 1.136, data_time: 
0.056, memory: 20914, loss_rpn_cls: 0.0463, loss_rpn_bbox: 0.0705, loss_cls: 0.2982, acc: 90.3760, loss_bbox: 0.3429, loss_mask: 0.3460, loss: 1.1039 2024-05-28 10:09:51,638 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 21:47:16, time: 0.908, data_time: 0.038, memory: 20914, loss_rpn_cls: 0.0520, loss_rpn_bbox: 0.0713, loss_cls: 0.3056, acc: 90.1697, loss_bbox: 0.3469, loss_mask: 0.3483, loss: 1.1241 2024-05-28 10:10:37,611 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 21:46:24, time: 0.919, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0529, loss_rpn_bbox: 0.0749, loss_cls: 0.2970, acc: 90.5491, loss_bbox: 0.3382, loss_mask: 0.3447, loss: 1.1077 2024-05-28 10:11:22,831 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 21:45:12, time: 0.904, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0704, loss_cls: 0.3023, acc: 90.3413, loss_bbox: 0.3386, loss_mask: 0.3416, loss: 1.0991 2024-05-28 10:12:08,029 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 21:44:00, time: 0.904, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0464, loss_rpn_bbox: 0.0696, loss_cls: 0.3046, acc: 90.2375, loss_bbox: 0.3455, loss_mask: 0.3421, loss: 1.1081 2024-05-28 10:12:52,857 - mmdet - INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 21:42:39, time: 0.897, data_time: 0.042, memory: 20914, loss_rpn_cls: 0.0420, loss_rpn_bbox: 0.0662, loss_cls: 0.2805, acc: 91.0557, loss_bbox: 0.3183, loss_mask: 0.3308, loss: 1.0378 2024-05-28 10:13:38,256 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 21:41:34, time: 0.908, data_time: 0.057, memory: 20914, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0742, loss_cls: 0.3075, acc: 90.2944, loss_bbox: 0.3480, loss_mask: 0.3485, loss: 1.1278 2024-05-28 10:14:24,306 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 21:40:46, time: 0.921, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0472, loss_rpn_bbox: 0.0723, loss_cls: 0.3044, acc: 90.2507, loss_bbox: 0.3441, loss_mask: 0.3381, loss: 1.1061 2024-05-28 10:15:09,899 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 21:39:47, time: 0.912, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0687, loss_cls: 0.3025, acc: 90.5491, loss_bbox: 0.3364, loss_mask: 0.3326, loss: 1.0862 2024-05-28 10:15:57,902 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 21:39:47, time: 0.960, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0525, loss_rpn_bbox: 0.0747, loss_cls: 0.3122, acc: 89.9158, loss_bbox: 0.3597, loss_mask: 0.3431, loss: 1.1422 2024-05-28 10:16:43,643 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 21:38:51, time: 0.915, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0710, loss_cls: 0.2863, acc: 90.6021, loss_bbox: 0.3379, loss_mask: 0.3365, loss: 1.0773 2024-05-28 10:17:29,491 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 21:37:58, time: 0.917, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0487, loss_rpn_bbox: 0.0701, loss_cls: 0.2971, acc: 90.4514, loss_bbox: 0.3445, loss_mask: 0.3335, loss: 1.0939 2024-05-28 10:18:14,913 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 21:36:55, time: 0.908, data_time: 0.060, memory: 20914, loss_rpn_cls: 0.0473, loss_rpn_bbox: 0.0746, loss_cls: 0.3043, acc: 89.9597, loss_bbox: 0.3572, loss_mask: 0.3398, loss: 1.1232 2024-05-28 10:19:00,191 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 21:35:50, time: 0.906, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0664, loss_cls: 0.2961, acc: 
90.4077, loss_bbox: 0.3387, loss_mask: 0.3339, loss: 1.0753 2024-05-28 10:19:46,022 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 21:34:57, time: 0.917, data_time: 0.052, memory: 20914, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0707, loss_cls: 0.2926, acc: 90.5212, loss_bbox: 0.3465, loss_mask: 0.3353, loss: 1.0881 2024-05-28 10:20:31,147 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 21:33:49, time: 0.902, data_time: 0.043, memory: 20914, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0716, loss_cls: 0.2928, acc: 90.5354, loss_bbox: 0.3362, loss_mask: 0.3325, loss: 1.0781 2024-05-28 10:21:16,235 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 21:32:41, time: 0.902, data_time: 0.047, memory: 20914, loss_rpn_cls: 0.0428, loss_rpn_bbox: 0.0669, loss_cls: 0.2874, acc: 90.5723, loss_bbox: 0.3371, loss_mask: 0.3329, loss: 1.0671 2024-05-28 10:22:01,371 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 21:31:34, time: 0.903, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0465, loss_rpn_bbox: 0.0697, loss_cls: 0.2974, acc: 90.2449, loss_bbox: 0.3476, loss_mask: 0.3342, loss: 1.0953 2024-05-28 10:22:59,094 - mmdet - INFO - Epoch [1][3900/7330] lr: 1.000e-04, eta: 21:34:59, time: 1.154, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0414, loss_rpn_bbox: 0.0665, loss_cls: 0.2852, acc: 90.7878, loss_bbox: 0.3263, loss_mask: 0.3244, loss: 1.0438 2024-05-28 10:23:44,385 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 21:33:53, time: 0.906, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0442, loss_rpn_bbox: 0.0708, loss_cls: 0.2987, acc: 90.4783, loss_bbox: 0.3389, loss_mask: 0.3273, loss: 1.0799 2024-05-28 10:24:29,599 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 10:24:29,599 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 21:32:46, time: 0.904, data_time: 0.047, memory: 20914, loss_rpn_cls: 0.0442, loss_rpn_bbox: 0.0681, loss_cls: 0.2986, acc: 90.4521, loss_bbox: 0.3401, loss_mask: 0.3284, loss: 1.0794 2024-05-28 10:25:15,188 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 21:31:47, time: 0.912, data_time: 0.059, memory: 20914, loss_rpn_cls: 0.0439, loss_rpn_bbox: 0.0750, loss_cls: 0.2988, acc: 90.3596, loss_bbox: 0.3405, loss_mask: 0.3299, loss: 1.0881 2024-05-28 10:26:00,262 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 21:30:39, time: 0.902, data_time: 0.064, memory: 20914, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0711, loss_cls: 0.2915, acc: 90.6152, loss_bbox: 0.3300, loss_mask: 0.3209, loss: 1.0593 2024-05-28 10:26:46,184 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 21:29:47, time: 0.918, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0640, loss_cls: 0.2918, acc: 90.6396, loss_bbox: 0.3342, loss_mask: 0.3171, loss: 1.0484 2024-05-28 10:27:31,943 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 21:28:53, time: 0.915, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0673, loss_cls: 0.2997, acc: 90.2373, loss_bbox: 0.3410, loss_mask: 0.3261, loss: 1.0746 2024-05-28 10:28:17,780 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 21:28:00, time: 0.917, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0391, loss_rpn_bbox: 0.0663, loss_cls: 0.2931, acc: 90.2947, loss_bbox: 0.3420, loss_mask: 0.3250, loss: 1.0656 2024-05-28 10:29:03,055 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 21:26:57, time: 0.905, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0425, loss_rpn_bbox: 0.0683, loss_cls: 
0.2977, acc: 90.2808, loss_bbox: 0.3433, loss_mask: 0.3213, loss: 1.0732 2024-05-28 10:29:50,380 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 21:26:33, time: 0.946, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0419, loss_rpn_bbox: 0.0687, loss_cls: 0.2746, acc: 91.0044, loss_bbox: 0.3141, loss_mask: 0.3161, loss: 1.0154 2024-05-28 10:30:35,300 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 21:25:24, time: 0.898, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0652, loss_cls: 0.2669, acc: 91.1389, loss_bbox: 0.3139, loss_mask: 0.3244, loss: 1.0117 2024-05-28 10:31:20,908 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 21:24:27, time: 0.912, data_time: 0.064, memory: 20914, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0666, loss_cls: 0.2862, acc: 90.7390, loss_bbox: 0.3277, loss_mask: 0.3277, loss: 1.0520 2024-05-28 10:32:06,431 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 21:23:30, time: 0.910, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0686, loss_cls: 0.2839, acc: 90.6223, loss_bbox: 0.3333, loss_mask: 0.3209, loss: 1.0491 2024-05-28 10:32:51,614 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 21:22:26, time: 0.904, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0708, loss_cls: 0.2925, acc: 90.5852, loss_bbox: 0.3318, loss_mask: 0.3206, loss: 1.0590 2024-05-28 10:33:37,298 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 21:21:32, time: 0.914, data_time: 0.044, memory: 20914, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0649, loss_cls: 0.2912, acc: 90.4114, loss_bbox: 0.3395, loss_mask: 0.3222, loss: 1.0602 2024-05-28 10:34:22,275 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 21:20:25, time: 0.900, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0624, loss_cls: 0.2727, acc: 91.0076, loss_bbox: 0.3175, loss_mask: 0.3142, loss: 1.0064 2024-05-28 10:35:08,170 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 21:19:36, time: 0.918, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0409, loss_rpn_bbox: 0.0641, loss_cls: 0.2821, acc: 90.5718, loss_bbox: 0.3356, loss_mask: 0.3152, loss: 1.0379 2024-05-28 10:35:52,679 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 21:18:21, time: 0.890, data_time: 0.045, memory: 20914, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0616, loss_cls: 0.2665, acc: 91.3159, loss_bbox: 0.3106, loss_mask: 0.3104, loss: 0.9841 2024-05-28 10:36:49,101 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 21:20:34, time: 1.128, data_time: 0.064, memory: 20914, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0683, loss_cls: 0.2973, acc: 90.3264, loss_bbox: 0.3399, loss_mask: 0.3212, loss: 1.0660 2024-05-28 10:37:34,935 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 21:19:42, time: 0.917, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0406, loss_rpn_bbox: 0.0666, loss_cls: 0.2820, acc: 90.7751, loss_bbox: 0.3241, loss_mask: 0.3193, loss: 1.0325 2024-05-28 10:38:21,056 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 21:18:55, time: 0.922, data_time: 0.039, memory: 20914, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0712, loss_cls: 0.2965, acc: 90.3713, loss_bbox: 0.3391, loss_mask: 0.3244, loss: 1.0756 2024-05-28 10:39:06,623 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 21:17:58, time: 0.911, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0627, loss_cls: 0.2610, acc: 91.3936, loss_bbox: 0.3047, loss_mask: 0.3028, loss: 0.9706 2024-05-28 10:39:52,421 
- mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 10:39:52,421 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 21:17:05, time: 0.916, data_time: 0.058, memory: 20914, loss_rpn_cls: 0.0417, loss_rpn_bbox: 0.0683, loss_cls: 0.2807, acc: 90.6094, loss_bbox: 0.3402, loss_mask: 0.3159, loss: 1.0469 2024-05-28 10:40:37,095 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 21:15:54, time: 0.893, data_time: 0.038, memory: 20914, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0620, loss_cls: 0.2737, acc: 90.9802, loss_bbox: 0.3154, loss_mask: 0.3124, loss: 1.0004 2024-05-28 10:41:21,848 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 21:14:45, time: 0.895, data_time: 0.048, memory: 20914, loss_rpn_cls: 0.0436, loss_rpn_bbox: 0.0651, loss_cls: 0.2736, acc: 90.8708, loss_bbox: 0.3261, loss_mask: 0.3152, loss: 1.0235 2024-05-28 10:42:07,039 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 21:13:43, time: 0.904, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0384, loss_rpn_bbox: 0.0658, loss_cls: 0.2796, acc: 90.6836, loss_bbox: 0.3340, loss_mask: 0.3170, loss: 1.0347 2024-05-28 10:42:52,489 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 21:12:46, time: 0.909, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0432, loss_rpn_bbox: 0.0667, loss_cls: 0.2808, acc: 90.8962, loss_bbox: 0.3163, loss_mask: 0.3101, loss: 1.0171 2024-05-28 10:43:40,068 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 21:12:23, time: 0.952, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0608, loss_cls: 0.2724, acc: 91.0808, loss_bbox: 0.3217, loss_mask: 0.3070, loss: 1.0021 2024-05-28 10:44:25,660 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 21:11:28, time: 0.912, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0631, loss_cls: 0.2799, acc: 90.7151, loss_bbox: 0.3255, loss_mask: 0.3156, loss: 1.0208 2024-05-28 10:45:11,934 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 21:10:44, time: 0.925, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0407, loss_rpn_bbox: 0.0702, loss_cls: 0.2844, acc: 90.6448, loss_bbox: 0.3324, loss_mask: 0.3116, loss: 1.0393 2024-05-28 10:45:57,359 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 21:09:47, time: 0.908, data_time: 0.055, memory: 20914, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0673, loss_cls: 0.2775, acc: 90.8684, loss_bbox: 0.3275, loss_mask: 0.3153, loss: 1.0263 2024-05-28 10:46:42,704 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 21:08:48, time: 0.907, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0427, loss_rpn_bbox: 0.0647, loss_cls: 0.2810, acc: 90.6187, loss_bbox: 0.3322, loss_mask: 0.3123, loss: 1.0328 2024-05-28 10:47:27,957 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 21:07:49, time: 0.905, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0656, loss_cls: 0.2862, acc: 90.6326, loss_bbox: 0.3321, loss_mask: 0.3088, loss: 1.0303 2024-05-28 10:48:13,502 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 21:06:54, time: 0.911, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0388, loss_rpn_bbox: 0.0640, loss_cls: 0.2766, acc: 90.9766, loss_bbox: 0.3195, loss_mask: 0.3030, loss: 1.0018 2024-05-28 10:48:58,711 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 21:05:55, time: 0.904, data_time: 0.047, memory: 20914, loss_rpn_cls: 0.0416, loss_rpn_bbox: 0.0627, loss_cls: 0.2819, acc: 90.8032, loss_bbox: 0.3232, loss_mask: 0.3069, loss: 1.0163 2024-05-28 
10:49:43,710 - mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 21:04:52, time: 0.900, data_time: 0.053, memory: 20914, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0643, loss_cls: 0.2839, acc: 90.5334, loss_bbox: 0.3270, loss_mask: 0.3117, loss: 1.0265 2024-05-28 10:50:38,660 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 21:06:14, time: 1.099, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0643, loss_cls: 0.2742, acc: 90.8489, loss_bbox: 0.3235, loss_mask: 0.3104, loss: 1.0117 2024-05-28 10:51:26,500 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 21:05:52, time: 0.957, data_time: 0.053, memory: 20914, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0638, loss_cls: 0.2824, acc: 90.7263, loss_bbox: 0.3229, loss_mask: 0.3057, loss: 1.0138 2024-05-28 10:52:12,586 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 21:05:04, time: 0.922, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0404, loss_rpn_bbox: 0.0684, loss_cls: 0.2685, acc: 90.9529, loss_bbox: 0.3264, loss_mask: 0.3058, loss: 1.0096 2024-05-28 10:52:57,687 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 21:04:02, time: 0.902, data_time: 0.042, memory: 20914, loss_rpn_cls: 0.0404, loss_rpn_bbox: 0.0641, loss_cls: 0.2657, acc: 91.1707, loss_bbox: 0.3116, loss_mask: 0.3047, loss: 0.9865 2024-05-28 10:53:43,040 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 21:03:05, time: 0.907, data_time: 0.047, memory: 20914, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0611, loss_cls: 0.2618, acc: 91.0681, loss_bbox: 0.3213, loss_mask: 0.3131, loss: 0.9931 2024-05-28 10:54:28,420 - mmdet - INFO - Epoch [1][5950/7330] lr: 1.000e-04, eta: 21:02:07, time: 0.908, data_time: 0.046, memory: 20914, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0593, loss_cls: 0.2531, acc: 91.4707, loss_bbox: 0.3076, loss_mask: 0.3012, loss: 0.9563 2024-05-28 10:55:14,252 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 10:55:14,253 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 21:01:17, time: 0.917, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0662, loss_cls: 0.2842, acc: 90.5598, loss_bbox: 0.3366, loss_mask: 0.3121, loss: 1.0382 2024-05-28 10:55:59,320 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 21:00:16, time: 0.901, data_time: 0.049, memory: 20914, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0662, loss_cls: 0.2651, acc: 91.1465, loss_bbox: 0.3149, loss_mask: 0.3072, loss: 0.9915 2024-05-28 10:56:44,641 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 20:59:18, time: 0.906, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0665, loss_cls: 0.2755, acc: 90.8594, loss_bbox: 0.3245, loss_mask: 0.3038, loss: 1.0080 2024-05-28 10:57:32,017 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 20:58:48, time: 0.948, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0646, loss_cls: 0.2795, acc: 90.7437, loss_bbox: 0.3287, loss_mask: 0.3067, loss: 1.0187 2024-05-28 10:58:17,152 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 20:57:49, time: 0.903, data_time: 0.055, memory: 20914, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0623, loss_cls: 0.2605, acc: 91.3167, loss_bbox: 0.3119, loss_mask: 0.3102, loss: 0.9825 2024-05-28 10:59:02,449 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 20:56:51, time: 0.906, data_time: 0.049, memory: 20914, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0630, loss_cls: 0.2656, acc: 91.1057, loss_bbox: 0.3175, loss_mask: 0.3090, loss: 0.9935 
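Every training record above follows the same per-iteration format: a timestamp, `Epoch [e][i/total]`, the timing fields (`time`, `data_time`, `memory`), the individual loss terms, `acc`, and the total `loss`. Below is a minimal sketch of pulling those values out of the raw log text for plotting a loss curve; it only relies on the field names visible in these records, and the `train.log` path is a placeholder I introduce for illustration, not something taken from this log.

```python
import re

# A record in this log looks like:
#   2024-05-28 10:59:02,449 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04,
#   ... loss_cls: 0.2656, acc: 91.1057, ... loss: 0.9935
# Only epoch, iteration, acc and the total loss are kept here. "loss: " (colon
# right after "loss") matches the total loss, not loss_bbox/loss_mask.
RECORD_RE = re.compile(
    r"Epoch \[(?P<epoch>\d+)\]\[(?P<iter>\d+)/\d+\]"
    r".*?acc: (?P<acc>[\d.]+)"
    r".*?loss: (?P<loss>[\d.]+)"
)

def parse_train_records(text):
    """Yield (epoch, iteration, acc, total_loss) from mmdet-style log text.

    Records that happen to be split across physical lines are simply skipped,
    since the regex does not match across newlines.
    """
    for m in RECORD_RE.finditer(text):
        yield (int(m.group("epoch")), int(m.group("iter")),
               float(m.group("acc")), float(m.group("loss")))

if __name__ == "__main__":
    # "train.log" is a placeholder path for wherever this log is stored.
    with open("train.log") as f:
        records = list(parse_train_records(f.read()))
    print(records[-3:])  # last few (epoch, iter, acc, loss) tuples
```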
2024-05-28 10:59:48,060 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 20:55:58, time: 0.912, data_time: 0.050, memory: 20914, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0637, loss_cls: 0.2777, acc: 90.7183, loss_bbox: 0.3192, loss_mask: 0.3068, loss: 1.0043 2024-05-28 11:00:33,770 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 20:55:07, time: 0.914, data_time: 0.064, memory: 20914, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0628, loss_cls: 0.2727, acc: 90.9802, loss_bbox: 0.3173, loss_mask: 0.3062, loss: 0.9990 2024-05-28 11:01:19,318 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 20:54:13, time: 0.911, data_time: 0.048, memory: 20914, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0613, loss_cls: 0.2767, acc: 90.7144, loss_bbox: 0.3205, loss_mask: 0.3067, loss: 1.0019 2024-05-28 11:02:04,458 - mmdet - INFO - Epoch [1][6450/7330] lr: 1.000e-04, eta: 20:53:14, time: 0.903, data_time: 0.053, memory: 20914, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0654, loss_cls: 0.2679, acc: 91.0659, loss_bbox: 0.3157, loss_mask: 0.2975, loss: 0.9840 2024-05-28 11:02:49,767 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 20:52:18, time: 0.906, data_time: 0.055, memory: 20914, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0630, loss_cls: 0.2686, acc: 90.9287, loss_bbox: 0.3184, loss_mask: 0.3051, loss: 0.9894 2024-05-28 11:03:35,162 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 20:51:23, time: 0.908, data_time: 0.060, memory: 20914, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0628, loss_cls: 0.2763, acc: 90.7942, loss_bbox: 0.3264, loss_mask: 0.3115, loss: 1.0148 2024-05-28 11:04:27,479 - mmdet - INFO - Epoch [1][6600/7330] lr: 1.000e-04, eta: 20:51:53, time: 1.046, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0605, loss_cls: 0.2529, acc: 91.4556, loss_bbox: 0.3043, loss_mask: 0.2968, loss: 0.9522 2024-05-28 11:05:17,055 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 20:51:49, time: 0.992, data_time: 0.053, memory: 20914, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0601, loss_cls: 0.2592, acc: 91.3079, loss_bbox: 0.3070, loss_mask: 0.2982, loss: 0.9614 2024-05-28 11:06:02,532 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 20:50:54, time: 0.910, data_time: 0.043, memory: 20914, loss_rpn_cls: 0.0364, loss_rpn_bbox: 0.0578, loss_cls: 0.2541, acc: 91.5342, loss_bbox: 0.3068, loss_mask: 0.2995, loss: 0.9546 2024-05-28 11:06:47,899 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 20:49:58, time: 0.907, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0599, loss_cls: 0.2530, acc: 91.5251, loss_bbox: 0.2969, loss_mask: 0.2949, loss: 0.9406 2024-05-28 11:07:34,071 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 20:49:12, time: 0.924, data_time: 0.065, memory: 20914, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0648, loss_cls: 0.2562, acc: 91.3196, loss_bbox: 0.3121, loss_mask: 0.3053, loss: 0.9736 2024-05-28 11:08:19,142 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 20:48:12, time: 0.901, data_time: 0.052, memory: 20914, loss_rpn_cls: 0.0373, loss_rpn_bbox: 0.0633, loss_cls: 0.2710, acc: 90.9939, loss_bbox: 0.3205, loss_mask: 0.3049, loss: 0.9971 2024-05-28 11:09:04,539 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 20:47:17, time: 0.908, data_time: 0.056, memory: 20914, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0632, loss_cls: 0.2698, acc: 91.0737, loss_bbox: 0.3146, loss_mask: 0.3064, loss: 0.9925 2024-05-28 11:09:49,875 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 20:46:21, 
time: 0.907, data_time: 0.059, memory: 20914, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0602, loss_cls: 0.2633, acc: 91.2612, loss_bbox: 0.3097, loss_mask: 0.3058, loss: 0.9777
2024-05-28 11:10:35,343 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 11:10:35,343 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 20:45:27, time: 0.909, data_time: 0.061, memory: 20914, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0585, loss_cls: 0.2520, acc: 91.5574, loss_bbox: 0.2954, loss_mask: 0.2951, loss: 0.9387
2024-05-28 11:11:23,066 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 20:44:59, time: 0.954, data_time: 0.042, memory: 20914, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0625, loss_cls: 0.2628, acc: 91.2485, loss_bbox: 0.3093, loss_mask: 0.3023, loss: 0.9745
2024-05-28 11:12:08,839 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 20:44:08, time: 0.915, data_time: 0.070, memory: 20914, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0613, loss_cls: 0.2754, acc: 90.7417, loss_bbox: 0.3237, loss_mask: 0.3125, loss: 1.0100
2024-05-28 11:12:54,394 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 20:43:15, time: 0.911, data_time: 0.063, memory: 20914, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0622, loss_cls: 0.2654, acc: 91.0632, loss_bbox: 0.3179, loss_mask: 0.3072, loss: 0.9888
2024-05-28 11:13:39,553 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 20:42:18, time: 0.903, data_time: 0.054, memory: 20914, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0647, loss_cls: 0.2639, acc: 90.9866, loss_bbox: 0.3163, loss_mask: 0.2986, loss: 0.9829
2024-05-28 11:14:25,324 - mmdet - INFO - Epoch [1][7250/7330] lr: 1.000e-04, eta: 20:41:28, time: 0.915, data_time: 0.064, memory: 20914, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0591, loss_cls: 0.2549, acc: 91.4705, loss_bbox: 0.2981, loss_mask: 0.2937, loss: 0.9424
2024-05-28 11:15:10,725 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 20:40:33, time: 0.908, data_time: 0.051, memory: 20914, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0643, loss_cls: 0.2675, acc: 91.0696, loss_bbox: 0.3160, loss_mask: 0.3042, loss: 0.9880
2024-05-28 11:15:39,340 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-28 11:17:53,612 - mmdet - INFO - Evaluating bbox...
2024-05-28 11:18:23,135 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.317
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.578
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.317
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.183
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.350
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.446
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.448
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.448
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.448
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.265
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.491
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.599
2024-05-28 11:18:23,135 - mmdet - INFO - Evaluating segm...
2024-05-28 11:18:54,911 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.308
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.539
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.312
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.129
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.338
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.497
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.425
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.425
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.213
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.474
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.611
2024-05-28 11:18:55,335 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 11:18:55,336 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.3170, bbox_mAP_50: 0.5780, bbox_mAP_75: 0.3170, bbox_mAP_s: 0.1830, bbox_mAP_m: 0.3500, bbox_mAP_l: 0.4460, bbox_mAP_copypaste: 0.317 0.578 0.317 0.183 0.350 0.446, segm_mAP: 0.3080, segm_mAP_50: 0.5390, segm_mAP_75: 0.3120, segm_mAP_s: 0.1290, segm_mAP_m: 0.3380, segm_mAP_l: 0.4970, segm_mAP_copypaste: 0.308 0.539 0.312 0.129 0.338 0.497
2024-05-28 11:19:51,982 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 20:36:12, time: 1.133, data_time: 0.125, memory: 20925, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0619, loss_cls: 0.2572, acc: 91.1829, loss_bbox: 0.3144, loss_mask: 0.2993, loss: 0.9687
2024-05-28 11:20:40,576 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 20:35:54, time: 0.972, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0621, loss_cls: 0.2590, acc: 91.1274, loss_bbox: 0.3153, loss_mask: 0.2954, loss: 0.9675
2024-05-28 11:21:26,319 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 20:35:04, time: 0.915, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0580, loss_cls: 0.2500, acc: 91.4446, loss_bbox: 0.2990, loss_mask: 0.2957, loss: 0.9360
2024-05-28 11:22:11,817 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 20:34:13, time: 0.910, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0585, loss_cls: 0.2506, acc: 91.5881, loss_bbox: 0.3011, loss_mask: 0.3024, loss: 0.9439
2024-05-28 11:22:57,474 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 20:33:23, time: 0.913, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0573, loss_cls: 0.2603, acc: 91.3887, loss_bbox: 0.3044, loss_mask: 0.2917, loss: 0.9449
2024-05-28 11:23:42,613 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 20:32:27, time: 0.903, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0340, loss_rpn_bbox: 0.0644, loss_cls: 0.2554, acc: 91.3647, loss_bbox: 0.3048, loss_mask: 0.2969, loss: 0.9555
2024-05-28 11:24:28,754 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 20:31:42, time: 0.923, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0596, loss_cls: 0.2467, acc: 91.3970, loss_bbox: 0.3036, loss_mask: 0.3001, loss: 0.9436
2024-05-28 11:25:15,164 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 20:31:00, time: 0.928, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0609, loss_cls: 0.2654, acc: 90.9368, loss_bbox: 0.3222,
loss_mask: 0.2970, loss: 0.9812 2024-05-28 11:26:00,511 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 20:30:07, time: 0.907, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0603, loss_cls: 0.2562, acc: 91.2874, loss_bbox: 0.3123, loss_mask: 0.2938, loss: 0.9544 2024-05-28 11:26:46,678 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 20:29:22, time: 0.923, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0590, loss_cls: 0.2524, acc: 91.3921, loss_bbox: 0.3036, loss_mask: 0.2936, loss: 0.9423 2024-05-28 11:27:33,129 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 20:28:41, time: 0.929, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0648, loss_cls: 0.2625, acc: 90.8662, loss_bbox: 0.3178, loss_mask: 0.3056, loss: 0.9897 2024-05-28 11:28:19,566 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 20:27:59, time: 0.929, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0633, loss_cls: 0.2526, acc: 91.2957, loss_bbox: 0.3071, loss_mask: 0.2916, loss: 0.9485 2024-05-28 11:29:12,425 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 20:28:21, time: 1.057, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0597, loss_cls: 0.2509, acc: 91.5308, loss_bbox: 0.3015, loss_mask: 0.2911, loss: 0.9369 2024-05-28 11:30:00,544 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 20:27:55, time: 0.962, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0626, loss_cls: 0.2545, acc: 91.2366, loss_bbox: 0.3065, loss_mask: 0.2959, loss: 0.9544 2024-05-28 11:30:48,243 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 20:27:25, time: 0.954, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0565, loss_cls: 0.2415, acc: 91.7336, loss_bbox: 0.2926, loss_mask: 0.2889, loss: 0.9098 2024-05-28 11:31:38,411 - mmdet - INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 20:27:19, time: 1.003, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0581, loss_cls: 0.2562, acc: 91.3120, loss_bbox: 0.3046, loss_mask: 0.2907, loss: 0.9422 2024-05-28 11:32:24,249 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 20:26:30, time: 0.917, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0589, loss_cls: 0.2605, acc: 91.2812, loss_bbox: 0.3037, loss_mask: 0.2903, loss: 0.9485 2024-05-28 11:33:10,109 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 20:25:42, time: 0.917, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0600, loss_cls: 0.2519, acc: 91.3672, loss_bbox: 0.3118, loss_mask: 0.2936, loss: 0.9521 2024-05-28 11:33:55,858 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 20:24:52, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0566, loss_cls: 0.2472, acc: 91.5068, loss_bbox: 0.2984, loss_mask: 0.2889, loss: 0.9228 2024-05-28 11:34:41,767 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 20:24:04, time: 0.918, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0579, loss_cls: 0.2580, acc: 91.1746, loss_bbox: 0.3097, loss_mask: 0.2976, loss: 0.9555 2024-05-28 11:35:27,686 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 20:23:16, time: 0.918, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0585, loss_cls: 0.2588, acc: 91.2290, loss_bbox: 0.3066, loss_mask: 0.2892, loss: 0.9458 2024-05-28 11:36:14,117 - mmdet - INFO - Epoch [2][1100/7330] lr: 
1.000e-04, eta: 20:22:33, time: 0.929, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0621, loss_cls: 0.2551, acc: 91.2510, loss_bbox: 0.3073, loss_mask: 0.2996, loss: 0.9584 2024-05-28 11:37:00,527 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 20:21:49, time: 0.928, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0630, loss_cls: 0.2565, acc: 91.2607, loss_bbox: 0.3084, loss_mask: 0.3007, loss: 0.9658 2024-05-28 11:37:48,838 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 20:21:24, time: 0.966, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0580, loss_cls: 0.2413, acc: 91.7283, loss_bbox: 0.2892, loss_mask: 0.2863, loss: 0.9084 2024-05-28 11:38:34,974 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 20:20:37, time: 0.923, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0601, loss_cls: 0.2471, acc: 91.5935, loss_bbox: 0.2957, loss_mask: 0.2842, loss: 0.9194 2024-05-28 11:39:20,720 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 20:19:48, time: 0.915, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0588, loss_cls: 0.2468, acc: 91.5291, loss_bbox: 0.2973, loss_mask: 0.2869, loss: 0.9216 2024-05-28 11:40:06,854 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 20:19:02, time: 0.923, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0634, loss_cls: 0.2533, acc: 91.3562, loss_bbox: 0.3052, loss_mask: 0.2950, loss: 0.9507 2024-05-28 11:40:52,317 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 20:18:10, time: 0.909, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0593, loss_cls: 0.2584, acc: 91.1738, loss_bbox: 0.3091, loss_mask: 0.2919, loss: 0.9507 2024-05-28 11:41:37,834 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 20:17:18, time: 0.910, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0587, loss_cls: 0.2406, acc: 91.7996, loss_bbox: 0.2918, loss_mask: 0.2898, loss: 0.9154 2024-05-28 11:42:23,876 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 20:16:31, time: 0.921, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0583, loss_cls: 0.2487, acc: 91.5120, loss_bbox: 0.3003, loss_mask: 0.2896, loss: 0.9283 2024-05-28 11:43:09,955 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 20:15:45, time: 0.922, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0554, loss_cls: 0.2464, acc: 91.5845, loss_bbox: 0.3006, loss_mask: 0.2905, loss: 0.9248 2024-05-28 11:43:56,172 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 20:14:59, time: 0.924, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0626, loss_cls: 0.2563, acc: 91.2378, loss_bbox: 0.3096, loss_mask: 0.3028, loss: 0.9660 2024-05-28 11:44:42,462 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 20:14:15, time: 0.926, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0604, loss_cls: 0.2502, acc: 91.3943, loss_bbox: 0.3064, loss_mask: 0.2942, loss: 0.9434 2024-05-28 11:45:28,987 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 20:13:32, time: 0.930, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0613, loss_cls: 0.2507, acc: 91.5105, loss_bbox: 0.3013, loss_mask: 0.2871, loss: 0.9350 2024-05-28 11:46:21,748 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 20:13:44, time: 1.055, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0347, 
loss_rpn_bbox: 0.0601, loss_cls: 0.2579, acc: 91.1799, loss_bbox: 0.3088, loss_mask: 0.2870, loss: 0.9485 2024-05-28 11:47:09,869 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 20:13:14, time: 0.962, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0593, loss_cls: 0.2454, acc: 91.6711, loss_bbox: 0.2948, loss_mask: 0.2892, loss: 0.9228 2024-05-28 11:48:01,985 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 20:13:19, time: 1.042, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0581, loss_cls: 0.2515, acc: 91.3979, loss_bbox: 0.3020, loss_mask: 0.2935, loss: 0.9388 2024-05-28 11:48:48,100 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 20:12:32, time: 0.922, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0594, loss_cls: 0.2531, acc: 91.3640, loss_bbox: 0.3053, loss_mask: 0.2902, loss: 0.9422 2024-05-28 11:49:33,935 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 20:11:43, time: 0.917, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0617, loss_cls: 0.2563, acc: 91.2380, loss_bbox: 0.3102, loss_mask: 0.2912, loss: 0.9507 2024-05-28 11:50:19,705 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 20:10:53, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0585, loss_cls: 0.2377, acc: 91.7974, loss_bbox: 0.2948, loss_mask: 0.2832, loss: 0.9068 2024-05-28 11:51:04,962 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 20:09:59, time: 0.905, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0584, loss_cls: 0.2534, acc: 91.3647, loss_bbox: 0.3029, loss_mask: 0.2917, loss: 0.9385 2024-05-28 11:51:50,275 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 20:09:05, time: 0.906, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0535, loss_cls: 0.2383, acc: 91.7327, loss_bbox: 0.2922, loss_mask: 0.2756, loss: 0.8918 2024-05-28 11:52:36,473 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 20:08:19, time: 0.924, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0586, loss_cls: 0.2529, acc: 91.3191, loss_bbox: 0.3035, loss_mask: 0.2900, loss: 0.9391 2024-05-28 11:53:22,389 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 20:07:31, time: 0.918, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0580, loss_cls: 0.2596, acc: 91.2322, loss_bbox: 0.3105, loss_mask: 0.2908, loss: 0.9532 2024-05-28 11:54:11,299 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 20:07:07, time: 0.978, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0566, loss_cls: 0.2391, acc: 91.7612, loss_bbox: 0.2890, loss_mask: 0.2852, loss: 0.9014 2024-05-28 11:54:56,984 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 20:06:16, time: 0.914, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0553, loss_cls: 0.2362, acc: 91.8030, loss_bbox: 0.2903, loss_mask: 0.2847, loss: 0.8983 2024-05-28 11:55:43,017 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 20:05:29, time: 0.921, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0589, loss_cls: 0.2533, acc: 91.3018, loss_bbox: 0.3059, loss_mask: 0.2865, loss: 0.9368 2024-05-28 11:56:29,005 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 20:04:41, time: 0.920, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0604, loss_cls: 0.2522, acc: 91.5210, loss_bbox: 0.3003, loss_mask: 0.2878, 
loss: 0.9358 2024-05-28 11:57:14,781 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 20:03:51, time: 0.915, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0557, loss_cls: 0.2419, acc: 91.6155, loss_bbox: 0.2985, loss_mask: 0.2897, loss: 0.9185 2024-05-28 11:57:59,947 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 20:02:57, time: 0.903, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0536, loss_cls: 0.2346, acc: 91.9890, loss_bbox: 0.2806, loss_mask: 0.2836, loss: 0.8840 2024-05-28 11:58:46,096 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 20:02:11, time: 0.923, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0581, loss_cls: 0.2467, acc: 91.5518, loss_bbox: 0.2975, loss_mask: 0.2882, loss: 0.9256 2024-05-28 11:59:31,622 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 20:01:19, time: 0.910, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0555, loss_cls: 0.2327, acc: 92.1135, loss_bbox: 0.2801, loss_mask: 0.2796, loss: 0.8783 2024-05-28 12:00:17,226 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 20:00:28, time: 0.912, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0588, loss_cls: 0.2471, acc: 91.5986, loss_bbox: 0.2958, loss_mask: 0.2857, loss: 0.9186 2024-05-28 12:01:02,795 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 19:59:37, time: 0.911, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0617, loss_cls: 0.2546, acc: 91.2876, loss_bbox: 0.3057, loss_mask: 0.2887, loss: 0.9439 2024-05-28 12:01:48,065 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 19:58:44, time: 0.905, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0531, loss_cls: 0.2315, acc: 92.0935, loss_bbox: 0.2755, loss_mask: 0.2717, loss: 0.8611 2024-05-28 12:02:36,403 - mmdet - INFO - Epoch [2][2800/7330] lr: 1.000e-04, eta: 19:58:15, time: 0.967, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0570, loss_cls: 0.2479, acc: 91.7136, loss_bbox: 0.2978, loss_mask: 0.2838, loss: 0.9197 2024-05-28 12:03:28,770 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 19:58:16, time: 1.047, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0582, loss_cls: 0.2335, acc: 91.9358, loss_bbox: 0.2889, loss_mask: 0.2802, loss: 0.8920 2024-05-28 12:04:16,909 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 19:57:44, time: 0.963, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0528, loss_cls: 0.2265, acc: 92.0957, loss_bbox: 0.2746, loss_mask: 0.2726, loss: 0.8551 2024-05-28 12:05:07,747 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 19:57:32, time: 1.016, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0596, loss_cls: 0.2286, acc: 92.0559, loss_bbox: 0.2793, loss_mask: 0.2775, loss: 0.8770 2024-05-28 12:05:53,867 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 19:56:45, time: 0.923, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0594, loss_cls: 0.2565, acc: 91.1833, loss_bbox: 0.3031, loss_mask: 0.2856, loss: 0.9380 2024-05-28 12:06:40,367 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 19:56:01, time: 0.930, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0603, loss_cls: 0.2446, acc: 91.5288, loss_bbox: 0.2963, loss_mask: 0.2935, loss: 0.9286 2024-05-28 12:07:26,291 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 
19:55:12, time: 0.918, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0564, loss_cls: 0.2416, acc: 91.7485, loss_bbox: 0.2922, loss_mask: 0.2890, loss: 0.9114 2024-05-28 12:08:12,887 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 19:54:29, time: 0.932, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0567, loss_cls: 0.2350, acc: 91.9897, loss_bbox: 0.2836, loss_mask: 0.2826, loss: 0.8881 2024-05-28 12:08:58,263 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 19:53:36, time: 0.907, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0557, loss_cls: 0.2444, acc: 91.6880, loss_bbox: 0.2886, loss_mask: 0.2784, loss: 0.8987 2024-05-28 12:09:43,872 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 19:52:45, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0585, loss_cls: 0.2382, acc: 91.9014, loss_bbox: 0.2843, loss_mask: 0.2859, loss: 0.8992 2024-05-28 12:10:29,920 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 19:51:57, time: 0.921, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0582, loss_cls: 0.2464, acc: 91.5300, loss_bbox: 0.2988, loss_mask: 0.2879, loss: 0.9256 2024-05-28 12:11:20,037 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 19:51:39, time: 1.003, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0548, loss_cls: 0.2396, acc: 91.7708, loss_bbox: 0.2894, loss_mask: 0.2839, loss: 0.8995 2024-05-28 12:12:05,547 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 19:50:48, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0565, loss_cls: 0.2352, acc: 91.8403, loss_bbox: 0.2823, loss_mask: 0.2807, loss: 0.8843 2024-05-28 12:12:51,531 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 19:49:59, time: 0.920, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0592, loss_cls: 0.2385, acc: 91.7124, loss_bbox: 0.2925, loss_mask: 0.2820, loss: 0.9017 2024-05-28 12:13:36,696 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 19:49:05, time: 0.903, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0579, loss_cls: 0.2492, acc: 91.4219, loss_bbox: 0.2975, loss_mask: 0.2819, loss: 0.9200 2024-05-28 12:14:23,057 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 19:48:20, time: 0.927, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0562, loss_cls: 0.2384, acc: 91.9492, loss_bbox: 0.2879, loss_mask: 0.2822, loss: 0.8968 2024-05-28 12:15:08,396 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 19:47:27, time: 0.907, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0571, loss_cls: 0.2479, acc: 91.6064, loss_bbox: 0.2915, loss_mask: 0.2812, loss: 0.9110 2024-05-28 12:15:53,980 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 19:46:36, time: 0.912, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0573, loss_cls: 0.2340, acc: 91.9329, loss_bbox: 0.2841, loss_mask: 0.2842, loss: 0.8918 2024-05-28 12:16:39,689 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 19:45:46, time: 0.914, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0564, loss_cls: 0.2478, acc: 91.6089, loss_bbox: 0.2926, loss_mask: 0.2863, loss: 0.9131 2024-05-28 12:17:25,841 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 19:44:59, time: 0.923, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0344, loss_rpn_bbox: 
0.0613, loss_cls: 0.2466, acc: 91.4446, loss_bbox: 0.2975, loss_mask: 0.2855, loss: 0.9253 2024-05-28 12:18:11,285 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 19:44:08, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0565, loss_cls: 0.2415, acc: 91.6711, loss_bbox: 0.2893, loss_mask: 0.2823, loss: 0.9014 2024-05-28 12:18:56,798 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 19:43:16, time: 0.910, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0569, loss_cls: 0.2449, acc: 91.5786, loss_bbox: 0.2957, loss_mask: 0.2862, loss: 0.9149 2024-05-28 12:19:51,194 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 19:43:26, time: 1.088, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0536, loss_cls: 0.2326, acc: 91.8186, loss_bbox: 0.2814, loss_mask: 0.2728, loss: 0.8677 2024-05-28 12:20:39,221 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 19:42:52, time: 0.960, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0547, loss_cls: 0.2445, acc: 91.6948, loss_bbox: 0.2856, loss_mask: 0.2823, loss: 0.8975 2024-05-28 12:21:28,524 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 19:42:26, time: 0.986, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0565, loss_cls: 0.2398, acc: 91.9036, loss_bbox: 0.2814, loss_mask: 0.2827, loss: 0.8921 2024-05-28 12:22:16,194 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 19:41:49, time: 0.953, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0535, loss_cls: 0.2319, acc: 92.0161, loss_bbox: 0.2791, loss_mask: 0.2803, loss: 0.8720 2024-05-28 12:23:01,967 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 19:40:59, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0582, loss_cls: 0.2490, acc: 91.5820, loss_bbox: 0.2944, loss_mask: 0.2815, loss: 0.9116 2024-05-28 12:23:47,586 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 19:40:08, time: 0.912, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0528, loss_cls: 0.2340, acc: 91.9358, loss_bbox: 0.2823, loss_mask: 0.2776, loss: 0.8773 2024-05-28 12:24:32,698 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 19:39:14, time: 0.902, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0577, loss_cls: 0.2410, acc: 91.8040, loss_bbox: 0.2859, loss_mask: 0.2797, loss: 0.8917 2024-05-28 12:25:18,632 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 19:38:25, time: 0.919, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0600, loss_cls: 0.2389, acc: 91.5449, loss_bbox: 0.2947, loss_mask: 0.2812, loss: 0.9062 2024-05-28 12:26:04,470 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 19:37:36, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0608, loss_cls: 0.2511, acc: 91.3213, loss_bbox: 0.3024, loss_mask: 0.2913, loss: 0.9392 2024-05-28 12:26:49,933 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 19:36:45, time: 0.909, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0563, loss_cls: 0.2287, acc: 92.0649, loss_bbox: 0.2837, loss_mask: 0.2757, loss: 0.8754 2024-05-28 12:27:37,314 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 19:36:05, time: 0.948, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0549, loss_cls: 0.2238, acc: 92.3447, loss_bbox: 0.2695, loss_mask: 0.2739, loss: 0.8524 
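At the end of each epoch the log prints a COCO evaluation block followed by an `Epoch(val)` summary record (for epoch 1 above: `bbox_mAP: 0.3170`, `segm_mAP: 0.3080`, plus `*_copypaste` fields that repeat the AP breakdown as space-separated numbers). The sketch below is one hedged way to turn such a summary record into a dict so epochs can be compared; it is plain string parsing based on this log's format, not an mmdetection API.

```python
import re

# The per-epoch validation summary looks like:
#   Epoch(val) [1][625] bbox_mAP: 0.3170, bbox_mAP_50: 0.5780, ...,
#   segm_mAP: 0.3080, ..., segm_mAP_copypaste: 0.308 0.539 0.312 ...
# Only the single-valued bbox_*/segm_* metrics are kept; the *_copypaste fields
# (several numbers with no comma after the first one) fail the pattern and are
# skipped on purpose.
EPOCH_RE = re.compile(r"Epoch\(val\) \[(\d+)\]")
METRIC_RE = re.compile(r"((?:bbox|segm)_mAP(?:_\w+)?): ([\d.]+)(?:,|\s*$)")

def parse_val_summary(line):
    """Return (epoch, {metric_name: value}) for one Epoch(val) record, or None."""
    m = EPOCH_RE.search(line)
    if m is None:
        return None
    metrics = {key: float(value) for key, value in METRIC_RE.findall(line)}
    return int(m.group(1)), metrics

# With the epoch-1 summary above this would give, e.g.:
#   (1, {"bbox_mAP": 0.317, "bbox_mAP_50": 0.578, ..., "segm_mAP": 0.308, ...})
```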
2024-05-28 12:28:23,125 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 19:35:16, time: 0.916, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0584, loss_cls: 0.2415, acc: 91.7673, loss_bbox: 0.2850, loss_mask: 0.2795, loss: 0.8948 2024-05-28 12:29:09,066 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 19:34:28, time: 0.919, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0551, loss_cls: 0.2420, acc: 91.6426, loss_bbox: 0.2959, loss_mask: 0.2825, loss: 0.9046 2024-05-28 12:29:54,883 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 19:33:38, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0542, loss_cls: 0.2368, acc: 91.8813, loss_bbox: 0.2836, loss_mask: 0.2785, loss: 0.8834 2024-05-28 12:30:40,869 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 19:32:50, time: 0.920, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0572, loss_cls: 0.2305, acc: 92.0681, loss_bbox: 0.2813, loss_mask: 0.2721, loss: 0.8708 2024-05-28 12:31:26,578 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 19:32:00, time: 0.914, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0565, loss_cls: 0.2392, acc: 91.8032, loss_bbox: 0.2885, loss_mask: 0.2797, loss: 0.8932 2024-05-28 12:32:12,240 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 19:31:10, time: 0.913, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0556, loss_cls: 0.2326, acc: 91.9500, loss_bbox: 0.2859, loss_mask: 0.2777, loss: 0.8830 2024-05-28 12:32:57,761 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 19:30:19, time: 0.910, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0547, loss_cls: 0.2295, acc: 91.9641, loss_bbox: 0.2819, loss_mask: 0.2757, loss: 0.8714 2024-05-28 12:33:43,155 - mmdet - INFO - Epoch [2][4800/7330] lr: 1.000e-04, eta: 19:29:27, time: 0.908, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0551, loss_cls: 0.2330, acc: 91.9092, loss_bbox: 0.2838, loss_mask: 0.2711, loss: 0.8716 2024-05-28 12:34:29,607 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 19:28:42, time: 0.929, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0582, loss_cls: 0.2423, acc: 91.5203, loss_bbox: 0.2978, loss_mask: 0.2818, loss: 0.9119 2024-05-28 12:35:15,353 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 19:27:53, time: 0.915, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0552, loss_cls: 0.2324, acc: 91.9319, loss_bbox: 0.2905, loss_mask: 0.2739, loss: 0.8804 2024-05-28 12:36:03,499 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 19:27:18, time: 0.963, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0557, loss_cls: 0.2339, acc: 91.9363, loss_bbox: 0.2821, loss_mask: 0.2784, loss: 0.8803 2024-05-28 12:36:54,254 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 19:26:59, time: 1.015, data_time: 0.074, memory: 20925, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0603, loss_cls: 0.2493, acc: 91.3418, loss_bbox: 0.3026, loss_mask: 0.2786, loss: 0.9215 2024-05-28 12:37:44,454 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 19:26:37, time: 1.004, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0578, loss_cls: 0.2454, acc: 91.6560, loss_bbox: 0.2924, loss_mask: 0.2778, loss: 0.9041 2024-05-28 12:38:34,535 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 19:26:14, 
time: 1.002, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0585, loss_cls: 0.2393, acc: 91.7249, loss_bbox: 0.2919, loss_mask: 0.2790, loss: 0.9006 2024-05-28 12:39:20,234 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 19:25:23, time: 0.914, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0550, loss_cls: 0.2321, acc: 91.9692, loss_bbox: 0.2835, loss_mask: 0.2712, loss: 0.8736 2024-05-28 12:40:06,148 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 19:24:35, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0555, loss_cls: 0.2392, acc: 91.7405, loss_bbox: 0.2836, loss_mask: 0.2800, loss: 0.8890 2024-05-28 12:40:52,418 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 19:23:48, time: 0.925, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0561, loss_cls: 0.2309, acc: 91.9014, loss_bbox: 0.2851, loss_mask: 0.2741, loss: 0.8750 2024-05-28 12:41:38,021 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 19:22:57, time: 0.912, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0555, loss_cls: 0.2288, acc: 92.0701, loss_bbox: 0.2786, loss_mask: 0.2790, loss: 0.8744 2024-05-28 12:42:23,630 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 19:22:07, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0531, loss_cls: 0.2289, acc: 92.0105, loss_bbox: 0.2768, loss_mask: 0.2769, loss: 0.8643 2024-05-28 12:43:08,999 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 19:21:15, time: 0.907, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0551, loss_cls: 0.2346, acc: 91.9551, loss_bbox: 0.2824, loss_mask: 0.2764, loss: 0.8791 2024-05-28 12:43:55,535 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 19:20:30, time: 0.931, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0602, loss_cls: 0.2402, acc: 91.7058, loss_bbox: 0.2914, loss_mask: 0.2774, loss: 0.9021 2024-05-28 12:44:43,621 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 19:19:54, time: 0.962, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0558, loss_cls: 0.2310, acc: 91.9419, loss_bbox: 0.2797, loss_mask: 0.2811, loss: 0.8743 2024-05-28 12:45:28,980 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 19:19:02, time: 0.907, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0540, loss_cls: 0.2375, acc: 91.8567, loss_bbox: 0.2758, loss_mask: 0.2705, loss: 0.8675 2024-05-28 12:46:15,316 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 19:18:16, time: 0.927, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0579, loss_cls: 0.2379, acc: 91.7871, loss_bbox: 0.2849, loss_mask: 0.2783, loss: 0.8908 2024-05-28 12:47:00,634 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 19:17:24, time: 0.906, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0549, loss_cls: 0.2336, acc: 91.9070, loss_bbox: 0.2836, loss_mask: 0.2701, loss: 0.8707 2024-05-28 12:47:46,456 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 19:16:35, time: 0.916, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0553, loss_cls: 0.2390, acc: 91.6882, loss_bbox: 0.2892, loss_mask: 0.2710, loss: 0.8847 2024-05-28 12:48:32,463 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 19:15:47, time: 0.920, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0529, 
loss_cls: 0.2250, acc: 92.1758, loss_bbox: 0.2724, loss_mask: 0.2746, loss: 0.8520 2024-05-28 12:49:18,314 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 19:14:58, time: 0.918, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0550, loss_cls: 0.2296, acc: 92.0581, loss_bbox: 0.2797, loss_mask: 0.2727, loss: 0.8648 2024-05-28 12:50:03,586 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 19:14:06, time: 0.905, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0545, loss_cls: 0.2300, acc: 91.9185, loss_bbox: 0.2768, loss_mask: 0.2731, loss: 0.8632 2024-05-28 12:50:49,667 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 19:13:18, time: 0.921, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0553, loss_cls: 0.2356, acc: 91.7664, loss_bbox: 0.2866, loss_mask: 0.2810, loss: 0.8865 2024-05-28 12:51:36,031 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 19:12:32, time: 0.928, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0591, loss_cls: 0.2360, acc: 91.6855, loss_bbox: 0.2815, loss_mask: 0.2698, loss: 0.8794 2024-05-28 12:52:21,188 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 19:11:40, time: 0.903, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0533, loss_cls: 0.2345, acc: 91.8352, loss_bbox: 0.2839, loss_mask: 0.2760, loss: 0.8784 2024-05-28 12:53:13,475 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 19:11:27, time: 1.046, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0515, loss_cls: 0.2261, acc: 92.1089, loss_bbox: 0.2744, loss_mask: 0.2646, loss: 0.8444 2024-05-28 12:54:01,426 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 19:10:50, time: 0.959, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0538, loss_cls: 0.2326, acc: 92.0139, loss_bbox: 0.2778, loss_mask: 0.2687, loss: 0.8608 2024-05-28 12:54:49,492 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 19:10:13, time: 0.961, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0555, loss_cls: 0.2322, acc: 92.0012, loss_bbox: 0.2785, loss_mask: 0.2702, loss: 0.8653 2024-05-28 12:55:39,162 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 19:09:45, time: 0.993, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0488, loss_cls: 0.2150, acc: 92.4270, loss_bbox: 0.2679, loss_mask: 0.2668, loss: 0.8239 2024-05-28 12:56:25,012 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 19:08:56, time: 0.917, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0550, loss_cls: 0.2248, acc: 92.2283, loss_bbox: 0.2775, loss_mask: 0.2702, loss: 0.8547 2024-05-28 12:57:10,581 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 19:08:05, time: 0.911, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0561, loss_cls: 0.2308, acc: 92.1436, loss_bbox: 0.2731, loss_mask: 0.2715, loss: 0.8590 2024-05-28 12:57:56,417 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 19:07:16, time: 0.917, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0555, loss_cls: 0.2333, acc: 91.9407, loss_bbox: 0.2785, loss_mask: 0.2729, loss: 0.8692 2024-05-28 12:58:42,525 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 19:06:29, time: 0.922, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0582, loss_cls: 0.2426, acc: 91.5635, loss_bbox: 0.2885, loss_mask: 0.2721, loss: 0.8909 2024-05-28 
12:59:27,597 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 19:05:36, time: 0.901, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0539, loss_cls: 0.2278, acc: 92.0525, loss_bbox: 0.2780, loss_mask: 0.2724, loss: 0.8624 2024-05-28 13:00:13,456 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 19:04:47, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0548, loss_cls: 0.2401, acc: 91.7070, loss_bbox: 0.2916, loss_mask: 0.2756, loss: 0.8904 2024-05-28 13:01:01,250 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 19:04:08, time: 0.956, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0516, loss_cls: 0.2223, acc: 92.3262, loss_bbox: 0.2653, loss_mask: 0.2683, loss: 0.8339 2024-05-28 13:01:46,985 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 19:03:19, time: 0.915, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0530, loss_cls: 0.2236, acc: 92.1775, loss_bbox: 0.2725, loss_mask: 0.2701, loss: 0.8466 2024-05-28 13:02:32,656 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 19:02:29, time: 0.913, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0577, loss_cls: 0.2346, acc: 91.8411, loss_bbox: 0.2870, loss_mask: 0.2718, loss: 0.8819 2024-05-28 13:03:18,576 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 19:01:40, time: 0.918, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0534, loss_cls: 0.2269, acc: 92.0886, loss_bbox: 0.2804, loss_mask: 0.2715, loss: 0.8597 2024-05-28 13:04:04,107 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 19:00:50, time: 0.911, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0507, loss_cls: 0.2157, acc: 92.4783, loss_bbox: 0.2675, loss_mask: 0.2664, loss: 0.8271 2024-05-28 13:04:49,767 - mmdet - INFO - Epoch [2][6800/7330] lr: 1.000e-04, eta: 19:00:00, time: 0.913, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0575, loss_cls: 0.2406, acc: 91.6819, loss_bbox: 0.2934, loss_mask: 0.2722, loss: 0.8949 2024-05-28 13:05:35,258 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 18:59:09, time: 0.910, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0551, loss_cls: 0.2202, acc: 92.2625, loss_bbox: 0.2718, loss_mask: 0.2725, loss: 0.8500 2024-05-28 13:06:22,118 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 18:58:26, time: 0.937, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0602, loss_cls: 0.2500, acc: 91.3020, loss_bbox: 0.2968, loss_mask: 0.2801, loss: 0.9208 2024-05-28 13:07:07,771 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 18:57:36, time: 0.913, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0570, loss_cls: 0.2354, acc: 91.9304, loss_bbox: 0.2807, loss_mask: 0.2701, loss: 0.8730 2024-05-28 13:07:53,836 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 18:56:48, time: 0.921, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0555, loss_cls: 0.2334, acc: 91.8171, loss_bbox: 0.2859, loss_mask: 0.2724, loss: 0.8776 2024-05-28 13:08:39,448 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 18:55:58, time: 0.912, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0528, loss_cls: 0.2230, acc: 92.3364, loss_bbox: 0.2669, loss_mask: 0.2656, loss: 0.8350 2024-05-28 13:09:27,518 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 18:55:21, time: 0.961, 
data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0541, loss_cls: 0.2273, acc: 91.9717, loss_bbox: 0.2798, loss_mask: 0.2721, loss: 0.8608
2024-05-28 13:10:17,502 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 18:54:53, time: 1.000, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0546, loss_cls: 0.2262, acc: 92.2261, loss_bbox: 0.2728, loss_mask: 0.2668, loss: 0.8490
2024-05-28 13:11:05,458 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 18:54:15, time: 0.959, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0529, loss_cls: 0.2279, acc: 92.0647, loss_bbox: 0.2757, loss_mask: 0.2662, loss: 0.8481
2024-05-28 13:11:58,960 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 18:54:05, time: 1.070, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0551, loss_cls: 0.2311, acc: 91.8560, loss_bbox: 0.2867, loss_mask: 0.2749, loss: 0.8797
2024-05-28 13:12:44,429 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 18:53:14, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0538, loss_cls: 0.2290, acc: 92.0447, loss_bbox: 0.2767, loss_mask: 0.2653, loss: 0.8515
2024-05-28 13:13:12,587 - mmdet - INFO - Saving checkpoint at 2 epochs
2024-05-28 13:15:24,328 - mmdet - INFO - Evaluating bbox...
2024-05-28 13:15:53,043 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.396
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.652
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.425
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.243
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.439
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.524
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.337
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.566
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.681
2024-05-28 13:15:53,043 - mmdet - INFO - Evaluating segm...
2024-05-28 13:16:23,207 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.368
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.613
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.384
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.168
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.407
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.571
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.484
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.484
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.484
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.274
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.531
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.674
2024-05-28 13:16:23,760 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 13:16:23,761 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3960, bbox_mAP_50: 0.6520, bbox_mAP_75: 0.4250, bbox_mAP_s: 0.2430, bbox_mAP_m: 0.4390, bbox_mAP_l: 0.5360, bbox_mAP_copypaste: 0.396 0.652 0.425 0.243 0.439 0.536, segm_mAP: 0.3680, segm_mAP_50: 0.6130, segm_mAP_75: 0.3840, segm_mAP_s: 0.1680, segm_mAP_m: 0.4070, segm_mAP_l: 0.5710, segm_mAP_copypaste: 0.368 0.613 0.384 0.168 0.407 0.571
2024-05-28 13:17:21,956 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 18:50:40, time: 1.163, data_time: 0.121, memory: 20925, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0551, loss_cls: 0.2247, acc: 92.0764, loss_bbox: 0.2729, loss_mask: 0.2706, loss: 0.8535
2024-05-28 13:18:08,486 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 18:49:55, time: 0.931, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0534, loss_cls: 0.2199, acc: 92.2979, loss_bbox: 0.2729, loss_mask: 0.2610, loss: 0.8363
2024-05-28 13:18:54,390 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 18:49:07, time: 0.918, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0566, loss_cls: 0.2281, acc: 91.9358, loss_bbox: 0.2787, loss_mask: 0.2694, loss: 0.8609
2024-05-28 13:19:40,035 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 18:48:17, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0487, loss_cls: 0.2134, acc: 92.5659, loss_bbox: 0.2650, loss_mask: 0.2644, loss: 0.8166
2024-05-28 13:20:25,995 - mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 18:47:29, time: 0.920, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0523, loss_cls: 0.2237, acc: 92.1265, loss_bbox: 0.2764, loss_mask: 0.2642, loss: 0.8430
2024-05-28 13:21:11,949 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 18:46:41, time: 0.919, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0525, loss_cls: 0.2143, acc: 92.3323, loss_bbox: 0.2707, loss_mask: 0.2587, loss: 0.8204
2024-05-28 13:21:59,279 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 18:46:00, time: 0.947, data_time: 0.067, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0552, loss_cls: 0.2250, acc: 92.1121, loss_bbox: 0.2727, loss_mask: 0.2650, loss: 0.8465
2024-05-28 13:22:44,694 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 18:45:09, time: 0.908, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0491, loss_cls: 0.2128, acc: 92.4932, loss_bbox: 0.2631,
loss_mask: 0.2611, loss: 0.8115 2024-05-28 13:23:33,939 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 18:44:37, time: 0.985, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0540, loss_cls: 0.2258, acc: 92.0112, loss_bbox: 0.2809, loss_mask: 0.2732, loss: 0.8618 2024-05-28 13:24:20,608 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 18:43:52, time: 0.933, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0542, loss_cls: 0.2231, acc: 92.1648, loss_bbox: 0.2780, loss_mask: 0.2638, loss: 0.8460 2024-05-28 13:25:11,313 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 18:43:27, time: 1.014, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0560, loss_cls: 0.2245, acc: 92.1184, loss_bbox: 0.2770, loss_mask: 0.2733, loss: 0.8561 2024-05-28 13:25:57,425 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 18:42:40, time: 0.922, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0488, loss_cls: 0.2236, acc: 92.1262, loss_bbox: 0.2708, loss_mask: 0.2600, loss: 0.8270 2024-05-28 13:26:45,591 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 18:42:02, time: 0.963, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0549, loss_cls: 0.2305, acc: 91.8755, loss_bbox: 0.2856, loss_mask: 0.2628, loss: 0.8585 2024-05-28 13:27:31,494 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 18:41:14, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0503, loss_cls: 0.2151, acc: 92.3394, loss_bbox: 0.2715, loss_mask: 0.2625, loss: 0.8244 2024-05-28 13:28:20,147 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 18:40:38, time: 0.973, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0534, loss_cls: 0.2311, acc: 91.8264, loss_bbox: 0.2813, loss_mask: 0.2706, loss: 0.8645 2024-05-28 13:29:08,270 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 18:40:00, time: 0.962, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0513, loss_cls: 0.2235, acc: 92.1416, loss_bbox: 0.2747, loss_mask: 0.2673, loss: 0.8411 2024-05-28 13:29:56,435 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 18:39:23, time: 0.963, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0527, loss_cls: 0.2146, acc: 92.3887, loss_bbox: 0.2652, loss_mask: 0.2659, loss: 0.8255 2024-05-28 13:30:42,636 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 18:38:35, time: 0.924, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0548, loss_cls: 0.2234, acc: 92.1182, loss_bbox: 0.2755, loss_mask: 0.2630, loss: 0.8430 2024-05-28 13:31:28,669 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 18:37:48, time: 0.921, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0526, loss_cls: 0.2184, acc: 92.1345, loss_bbox: 0.2793, loss_mask: 0.2696, loss: 0.8455 2024-05-28 13:32:14,708 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 18:37:00, time: 0.921, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0530, loss_cls: 0.2260, acc: 91.9685, loss_bbox: 0.2795, loss_mask: 0.2704, loss: 0.8563 2024-05-28 13:33:00,470 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 18:36:11, time: 0.915, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0519, loss_cls: 0.2206, acc: 92.3167, loss_bbox: 0.2698, loss_mask: 0.2608, loss: 0.8277 2024-05-28 13:33:48,645 - mmdet - INFO - Epoch [3][1100/7330] lr: 
1.000e-04, eta: 18:35:33, time: 0.964, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0520, loss_cls: 0.2112, acc: 92.4595, loss_bbox: 0.2638, loss_mask: 0.2679, loss: 0.8234 2024-05-28 13:34:35,086 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 18:34:47, time: 0.929, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0541, loss_cls: 0.2266, acc: 92.0679, loss_bbox: 0.2720, loss_mask: 0.2690, loss: 0.8502 2024-05-28 13:35:21,187 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 18:33:59, time: 0.922, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0518, loss_cls: 0.2136, acc: 92.4043, loss_bbox: 0.2648, loss_mask: 0.2593, loss: 0.8136 2024-05-28 13:36:07,633 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 18:33:13, time: 0.929, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0533, loss_cls: 0.2251, acc: 92.0808, loss_bbox: 0.2767, loss_mask: 0.2673, loss: 0.8482 2024-05-28 13:36:54,084 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 18:32:28, time: 0.929, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0517, loss_cls: 0.2172, acc: 92.3472, loss_bbox: 0.2692, loss_mask: 0.2620, loss: 0.8253 2024-05-28 13:37:40,335 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 18:31:41, time: 0.925, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0524, loss_cls: 0.2294, acc: 92.1138, loss_bbox: 0.2743, loss_mask: 0.2693, loss: 0.8499 2024-05-28 13:38:26,260 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 18:30:52, time: 0.918, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0509, loss_cls: 0.2146, acc: 92.4631, loss_bbox: 0.2650, loss_mask: 0.2675, loss: 0.8229 2024-05-28 13:39:12,943 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 18:30:08, time: 0.934, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0525, loss_cls: 0.2236, acc: 92.1169, loss_bbox: 0.2725, loss_mask: 0.2665, loss: 0.8407 2024-05-28 13:39:59,223 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 18:29:21, time: 0.926, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0537, loss_cls: 0.2222, acc: 92.2314, loss_bbox: 0.2728, loss_mask: 0.2697, loss: 0.8460 2024-05-28 13:40:47,906 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 18:28:45, time: 0.974, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0525, loss_cls: 0.2233, acc: 92.0852, loss_bbox: 0.2745, loss_mask: 0.2749, loss: 0.8532 2024-05-28 13:41:35,943 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 18:28:06, time: 0.961, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0515, loss_cls: 0.2120, acc: 92.5259, loss_bbox: 0.2664, loss_mask: 0.2667, loss: 0.8217 2024-05-28 13:42:24,481 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 18:27:29, time: 0.971, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0535, loss_cls: 0.2245, acc: 92.0671, loss_bbox: 0.2805, loss_mask: 0.2687, loss: 0.8551 2024-05-28 13:43:10,820 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 18:26:43, time: 0.927, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0525, loss_cls: 0.2108, acc: 92.5833, loss_bbox: 0.2617, loss_mask: 0.2639, loss: 0.8122 2024-05-28 13:43:59,529 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 18:26:06, time: 0.974, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0249, 
loss_rpn_bbox: 0.0524, loss_cls: 0.2184, acc: 92.3000, loss_bbox: 0.2708, loss_mask: 0.2601, loss: 0.8266 2024-05-28 13:44:46,028 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 18:25:21, time: 0.930, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0526, loss_cls: 0.2231, acc: 91.9968, loss_bbox: 0.2819, loss_mask: 0.2631, loss: 0.8474 2024-05-28 13:45:35,025 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 18:24:45, time: 0.980, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0500, loss_cls: 0.2159, acc: 92.4124, loss_bbox: 0.2618, loss_mask: 0.2570, loss: 0.8084 2024-05-28 13:46:27,649 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 18:24:26, time: 1.052, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0528, loss_cls: 0.2214, acc: 92.2532, loss_bbox: 0.2712, loss_mask: 0.2594, loss: 0.8314 2024-05-28 13:47:13,757 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 18:23:38, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0558, loss_cls: 0.2106, acc: 92.6367, loss_bbox: 0.2590, loss_mask: 0.2633, loss: 0.8173 2024-05-28 13:47:59,854 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 18:22:51, time: 0.922, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0513, loss_cls: 0.2157, acc: 92.3594, loss_bbox: 0.2667, loss_mask: 0.2699, loss: 0.8300 2024-05-28 13:48:45,791 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 18:22:02, time: 0.919, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0525, loss_cls: 0.2101, acc: 92.5527, loss_bbox: 0.2597, loss_mask: 0.2586, loss: 0.8056 2024-05-28 13:49:31,765 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 18:21:14, time: 0.919, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0530, loss_cls: 0.2256, acc: 91.9429, loss_bbox: 0.2796, loss_mask: 0.2644, loss: 0.8495 2024-05-28 13:50:19,620 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 18:20:34, time: 0.957, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0539, loss_cls: 0.2189, acc: 92.2976, loss_bbox: 0.2697, loss_mask: 0.2679, loss: 0.8388 2024-05-28 13:51:05,138 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 18:19:44, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0525, loss_cls: 0.2209, acc: 92.2439, loss_bbox: 0.2734, loss_mask: 0.2676, loss: 0.8424 2024-05-28 13:51:51,634 - mmdet - INFO - Epoch [3][2250/7330] lr: 1.000e-04, eta: 18:18:58, time: 0.930, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0515, loss_cls: 0.2231, acc: 92.2524, loss_bbox: 0.2691, loss_mask: 0.2673, loss: 0.8381 2024-05-28 13:52:37,770 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 18:18:10, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0537, loss_cls: 0.2253, acc: 92.1147, loss_bbox: 0.2782, loss_mask: 0.2647, loss: 0.8489 2024-05-28 13:53:23,749 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 18:17:22, time: 0.920, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0506, loss_cls: 0.2220, acc: 92.1665, loss_bbox: 0.2683, loss_mask: 0.2622, loss: 0.8266 2024-05-28 13:54:11,005 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 18:16:39, time: 0.945, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0545, loss_cls: 0.2356, acc: 91.5269, loss_bbox: 0.2880, loss_mask: 0.2708, 
loss: 0.8758 2024-05-28 13:54:56,698 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 18:15:50, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0514, loss_cls: 0.2112, acc: 92.4910, loss_bbox: 0.2613, loss_mask: 0.2631, loss: 0.8127 2024-05-28 13:55:42,660 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 18:15:01, time: 0.919, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0530, loss_cls: 0.2213, acc: 92.1865, loss_bbox: 0.2746, loss_mask: 0.2591, loss: 0.8336 2024-05-28 13:56:29,377 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 18:14:16, time: 0.934, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0532, loss_cls: 0.2268, acc: 92.0925, loss_bbox: 0.2749, loss_mask: 0.2674, loss: 0.8502 2024-05-28 13:57:19,218 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 18:13:44, time: 0.997, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0534, loss_cls: 0.2217, acc: 92.2090, loss_bbox: 0.2754, loss_mask: 0.2684, loss: 0.8456 2024-05-28 13:58:06,988 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 18:13:03, time: 0.955, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0549, loss_cls: 0.2206, acc: 92.1965, loss_bbox: 0.2787, loss_mask: 0.2720, loss: 0.8504 2024-05-28 13:58:56,678 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 18:12:30, time: 0.994, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0490, loss_cls: 0.2047, acc: 92.9092, loss_bbox: 0.2504, loss_mask: 0.2522, loss: 0.7803 2024-05-28 13:59:42,785 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 18:11:42, time: 0.922, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0521, loss_cls: 0.2217, acc: 92.2576, loss_bbox: 0.2708, loss_mask: 0.2623, loss: 0.8328 2024-05-28 14:00:31,199 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 18:11:04, time: 0.968, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0515, loss_cls: 0.2155, acc: 92.3213, loss_bbox: 0.2712, loss_mask: 0.2640, loss: 0.8272 2024-05-28 14:01:17,098 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 18:10:15, time: 0.918, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0482, loss_cls: 0.2148, acc: 92.3643, loss_bbox: 0.2643, loss_mask: 0.2587, loss: 0.8105 2024-05-28 14:02:05,303 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 18:09:36, time: 0.964, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0476, loss_cls: 0.2169, acc: 92.4675, loss_bbox: 0.2621, loss_mask: 0.2600, loss: 0.8117 2024-05-28 14:02:54,209 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 18:09:00, time: 0.978, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0533, loss_cls: 0.2152, acc: 92.3804, loss_bbox: 0.2687, loss_mask: 0.2564, loss: 0.8191 2024-05-28 14:03:45,315 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 18:08:32, time: 1.022, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0525, loss_cls: 0.2253, acc: 92.0798, loss_bbox: 0.2705, loss_mask: 0.2626, loss: 0.8371 2024-05-28 14:04:31,767 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 18:07:45, time: 0.929, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0537, loss_cls: 0.2265, acc: 91.9656, loss_bbox: 0.2776, loss_mask: 0.2699, loss: 0.8553 2024-05-28 14:05:18,185 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 
18:06:59, time: 0.928, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0534, loss_cls: 0.2293, acc: 91.9685, loss_bbox: 0.2753, loss_mask: 0.2640, loss: 0.8484 2024-05-28 14:06:05,012 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 18:06:14, time: 0.937, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0582, loss_cls: 0.2216, acc: 92.0696, loss_bbox: 0.2732, loss_mask: 0.2643, loss: 0.8442 2024-05-28 14:06:51,499 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 18:05:27, time: 0.930, data_time: 0.066, memory: 20925, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0552, loss_cls: 0.2168, acc: 92.2981, loss_bbox: 0.2718, loss_mask: 0.2647, loss: 0.8352 2024-05-28 14:07:39,613 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 18:04:47, time: 0.962, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0518, loss_cls: 0.2126, acc: 92.5901, loss_bbox: 0.2582, loss_mask: 0.2633, loss: 0.8113 2024-05-28 14:08:24,861 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 18:03:56, time: 0.905, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0507, loss_cls: 0.2117, acc: 92.5203, loss_bbox: 0.2632, loss_mask: 0.2589, loss: 0.8112 2024-05-28 14:09:10,741 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 18:03:08, time: 0.918, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0525, loss_cls: 0.2151, acc: 92.4258, loss_bbox: 0.2650, loss_mask: 0.2499, loss: 0.8056 2024-05-28 14:09:56,488 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 18:02:18, time: 0.915, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0522, loss_cls: 0.2134, acc: 92.3745, loss_bbox: 0.2658, loss_mask: 0.2636, loss: 0.8220 2024-05-28 14:10:42,711 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 18:01:31, time: 0.924, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0518, loss_cls: 0.2185, acc: 92.2903, loss_bbox: 0.2676, loss_mask: 0.2623, loss: 0.8254 2024-05-28 14:11:28,583 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 18:00:42, time: 0.917, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0484, loss_cls: 0.2080, acc: 92.6353, loss_bbox: 0.2582, loss_mask: 0.2585, loss: 0.7985 2024-05-28 14:12:14,606 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 17:59:54, time: 0.920, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0515, loss_cls: 0.2126, acc: 92.4509, loss_bbox: 0.2604, loss_mask: 0.2603, loss: 0.8113 2024-05-28 14:13:00,411 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 17:59:05, time: 0.916, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0521, loss_cls: 0.2176, acc: 92.2039, loss_bbox: 0.2668, loss_mask: 0.2571, loss: 0.8188 2024-05-28 14:13:46,976 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 17:58:19, time: 0.931, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0537, loss_cls: 0.2268, acc: 91.9751, loss_bbox: 0.2765, loss_mask: 0.2570, loss: 0.8417 2024-05-28 14:14:35,719 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 17:57:42, time: 0.975, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0526, loss_cls: 0.2146, acc: 92.4167, loss_bbox: 0.2648, loss_mask: 0.2630, loss: 0.8222 2024-05-28 14:15:23,881 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 17:57:02, time: 0.963, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0250, loss_rpn_bbox: 
0.0505, loss_cls: 0.2163, acc: 92.3328, loss_bbox: 0.2631, loss_mask: 0.2551, loss: 0.8100 2024-05-28 14:16:12,437 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 17:56:23, time: 0.971, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0537, loss_cls: 0.2233, acc: 92.1499, loss_bbox: 0.2706, loss_mask: 0.2572, loss: 0.8286 2024-05-28 14:16:58,185 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 17:55:34, time: 0.915, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0507, loss_cls: 0.2098, acc: 92.4006, loss_bbox: 0.2625, loss_mask: 0.2551, loss: 0.8025 2024-05-28 14:17:46,190 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 17:54:53, time: 0.960, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0528, loss_cls: 0.2182, acc: 92.3506, loss_bbox: 0.2710, loss_mask: 0.2635, loss: 0.8320 2024-05-28 14:18:32,611 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 17:54:07, time: 0.928, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0553, loss_cls: 0.2221, acc: 92.0947, loss_bbox: 0.2739, loss_mask: 0.2638, loss: 0.8426 2024-05-28 14:19:20,724 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 17:53:26, time: 0.962, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0500, loss_cls: 0.2095, acc: 92.5632, loss_bbox: 0.2531, loss_mask: 0.2570, loss: 0.7945 2024-05-28 14:20:12,015 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 17:52:58, time: 1.026, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0469, loss_cls: 0.2002, acc: 92.8076, loss_bbox: 0.2493, loss_mask: 0.2553, loss: 0.7761 2024-05-28 14:20:57,769 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 17:52:08, time: 0.915, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0528, loss_cls: 0.2169, acc: 92.4365, loss_bbox: 0.2604, loss_mask: 0.2548, loss: 0.8111 2024-05-28 14:21:43,988 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 17:51:21, time: 0.924, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0512, loss_cls: 0.2126, acc: 92.5298, loss_bbox: 0.2594, loss_mask: 0.2567, loss: 0.8042 2024-05-28 14:22:29,455 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 17:50:31, time: 0.909, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0503, loss_cls: 0.2120, acc: 92.4832, loss_bbox: 0.2596, loss_mask: 0.2570, loss: 0.8029 2024-05-28 14:23:14,849 - mmdet - INFO - Epoch [3][4250/7330] lr: 1.000e-04, eta: 17:49:40, time: 0.908, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0539, loss_cls: 0.2218, acc: 92.2092, loss_bbox: 0.2732, loss_mask: 0.2659, loss: 0.8392 2024-05-28 14:24:03,161 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 17:49:01, time: 0.966, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0501, loss_cls: 0.2189, acc: 92.3237, loss_bbox: 0.2627, loss_mask: 0.2580, loss: 0.8130 2024-05-28 14:24:49,278 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 17:48:13, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0529, loss_cls: 0.2181, acc: 92.2900, loss_bbox: 0.2695, loss_mask: 0.2651, loss: 0.8301 2024-05-28 14:25:34,587 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 17:47:22, time: 0.906, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0490, loss_cls: 0.2085, acc: 92.6091, loss_bbox: 0.2568, loss_mask: 0.2567, loss: 0.7957 
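
Note: the per-iteration "loss:" values and the per-epoch "Epoch(val) ... bbox_mAP / segm_mAP" summaries recorded in this log follow a fixed format, so training and validation curves can be scraped directly from the raw text. The snippet below is a minimal Python sketch under the assumption that the full log has been saved to a file; the path "train.log" is a placeholder, and the regular expressions rely only on field names that actually appear in this log.

import re

# Patterns matching the two kinds of records seen in this log:
#   "Epoch [2][5800/7330] ... loss: 0.8648"                              -> per-iteration total loss
#   "Epoch(val) [2][625] bbox_mAP: 0.3960, ... segm_mAP: 0.3680, ..."    -> per-epoch validation mAP
ITER_RE = re.compile(r"Epoch \[(\d+)\]\[(\d+)/\d+\].*?loss: ([\d.]+)")
VAL_RE = re.compile(r"Epoch\(val\) \[(\d+)\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)")

train_loss = []  # (epoch, iteration, total loss)
val_map = []     # (epoch, bbox_mAP, segm_mAP)

with open("train.log") as f:  # placeholder path, not taken from this log
    for line in f:
        for m in ITER_RE.finditer(line):
            train_loss.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))
        for m in VAL_RE.finditer(line):
            val_map.append((int(m.group(1)), float(m.group(2)), float(m.group(3))))

print(val_map)  # e.g. [(2, 0.396, 0.368), (3, 0.424, 0.388), ...] for this run

Using finditer rather than a single search per line keeps the sketch robust even when several log records have been flattened onto one physical line, as in the text above.
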
2024-05-28 14:26:20,364 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 17:46:33, time: 0.916, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0499, loss_cls: 0.2156, acc: 92.3462, loss_bbox: 0.2639, loss_mask: 0.2567, loss: 0.8100 2024-05-28 14:27:05,690 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 17:45:43, time: 0.907, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0552, loss_cls: 0.2176, acc: 92.2061, loss_bbox: 0.2724, loss_mask: 0.2675, loss: 0.8388 2024-05-28 14:27:51,950 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 17:44:55, time: 0.925, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0546, loss_cls: 0.2213, acc: 92.1694, loss_bbox: 0.2689, loss_mask: 0.2655, loss: 0.8379 2024-05-28 14:28:38,102 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 17:44:08, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0518, loss_cls: 0.2082, acc: 92.6772, loss_bbox: 0.2568, loss_mask: 0.2574, loss: 0.7972 2024-05-28 14:29:24,223 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 17:43:20, time: 0.922, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0520, loss_cls: 0.2251, acc: 92.0554, loss_bbox: 0.2776, loss_mask: 0.2632, loss: 0.8422 2024-05-28 14:30:10,015 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 17:42:31, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0540, loss_cls: 0.2244, acc: 92.1792, loss_bbox: 0.2689, loss_mask: 0.2581, loss: 0.8293 2024-05-28 14:30:58,597 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 17:41:52, time: 0.972, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0497, loss_cls: 0.2094, acc: 92.6084, loss_bbox: 0.2572, loss_mask: 0.2605, loss: 0.8008 2024-05-28 14:31:44,470 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 17:41:04, time: 0.917, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0505, loss_cls: 0.2198, acc: 92.2351, loss_bbox: 0.2737, loss_mask: 0.2564, loss: 0.8232 2024-05-28 14:32:35,554 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 17:40:33, time: 1.022, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0516, loss_cls: 0.2225, acc: 92.1067, loss_bbox: 0.2724, loss_mask: 0.2608, loss: 0.8339 2024-05-28 14:33:21,227 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 17:39:44, time: 0.913, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0505, loss_cls: 0.2124, acc: 92.4744, loss_bbox: 0.2614, loss_mask: 0.2568, loss: 0.8043 2024-05-28 14:34:09,896 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 17:39:05, time: 0.974, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0493, loss_cls: 0.2198, acc: 92.1450, loss_bbox: 0.2710, loss_mask: 0.2613, loss: 0.8256 2024-05-28 14:34:55,791 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 17:38:17, time: 0.918, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0500, loss_cls: 0.2044, acc: 92.8210, loss_bbox: 0.2557, loss_mask: 0.2581, loss: 0.7915 2024-05-28 14:35:44,209 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 17:37:37, time: 0.968, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0515, loss_cls: 0.2154, acc: 92.3262, loss_bbox: 0.2698, loss_mask: 0.2559, loss: 0.8150 2024-05-28 14:36:32,995 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 17:36:58, 
time: 0.976, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0512, loss_cls: 0.2162, acc: 92.3315, loss_bbox: 0.2673, loss_mask: 0.2596, loss: 0.8190 2024-05-28 14:37:21,382 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 17:36:18, time: 0.967, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0498, loss_cls: 0.2078, acc: 92.6152, loss_bbox: 0.2536, loss_mask: 0.2576, loss: 0.7949 2024-05-28 14:38:07,215 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 17:35:29, time: 0.917, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0499, loss_cls: 0.2144, acc: 92.5208, loss_bbox: 0.2669, loss_mask: 0.2551, loss: 0.8094 2024-05-28 14:38:53,099 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 17:34:41, time: 0.918, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0502, loss_cls: 0.2133, acc: 92.5200, loss_bbox: 0.2581, loss_mask: 0.2562, loss: 0.8017 2024-05-28 14:39:38,990 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 17:33:52, time: 0.918, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0499, loss_cls: 0.2127, acc: 92.4478, loss_bbox: 0.2588, loss_mask: 0.2564, loss: 0.8002 2024-05-28 14:40:24,564 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 17:33:03, time: 0.911, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0496, loss_cls: 0.2092, acc: 92.6863, loss_bbox: 0.2596, loss_mask: 0.2476, loss: 0.7895 2024-05-28 14:41:13,045 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 17:32:23, time: 0.970, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0521, loss_cls: 0.2251, acc: 91.9512, loss_bbox: 0.2741, loss_mask: 0.2643, loss: 0.8402 2024-05-28 14:41:59,591 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 17:31:37, time: 0.931, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0502, loss_cls: 0.2209, acc: 92.1982, loss_bbox: 0.2674, loss_mask: 0.2571, loss: 0.8225 2024-05-28 14:42:46,097 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 17:30:50, time: 0.930, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0526, loss_cls: 0.2248, acc: 91.9736, loss_bbox: 0.2768, loss_mask: 0.2646, loss: 0.8435 2024-05-28 14:43:32,155 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 17:30:02, time: 0.921, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0474, loss_cls: 0.2050, acc: 92.6201, loss_bbox: 0.2582, loss_mask: 0.2526, loss: 0.7872 2024-05-28 14:44:17,587 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 17:29:12, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0519, loss_cls: 0.2130, acc: 92.3550, loss_bbox: 0.2654, loss_mask: 0.2522, loss: 0.8070 2024-05-28 14:45:03,717 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 17:28:24, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0499, loss_cls: 0.2132, acc: 92.4653, loss_bbox: 0.2595, loss_mask: 0.2524, loss: 0.8004 2024-05-28 14:45:49,909 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 17:27:37, time: 0.924, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0518, loss_cls: 0.2222, acc: 92.1343, loss_bbox: 0.2708, loss_mask: 0.2565, loss: 0.8259 2024-05-28 14:46:35,582 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 17:26:48, time: 0.913, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0509, 
loss_cls: 0.2186, acc: 92.2759, loss_bbox: 0.2686, loss_mask: 0.2611, loss: 0.8231 2024-05-28 14:47:21,195 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 17:25:58, time: 0.912, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0529, loss_cls: 0.2209, acc: 92.1958, loss_bbox: 0.2683, loss_mask: 0.2614, loss: 0.8275 2024-05-28 14:48:09,717 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 17:25:18, time: 0.970, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0482, loss_cls: 0.2106, acc: 92.5312, loss_bbox: 0.2582, loss_mask: 0.2614, loss: 0.8009 2024-05-28 14:49:00,062 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 17:24:45, time: 1.007, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0519, loss_cls: 0.2174, acc: 92.3489, loss_bbox: 0.2614, loss_mask: 0.2576, loss: 0.8139 2024-05-28 14:49:47,303 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 17:24:01, time: 0.945, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0558, loss_cls: 0.2208, acc: 92.1140, loss_bbox: 0.2725, loss_mask: 0.2643, loss: 0.8415 2024-05-28 14:50:36,174 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 17:23:22, time: 0.977, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0508, loss_cls: 0.2114, acc: 92.4668, loss_bbox: 0.2627, loss_mask: 0.2562, loss: 0.8067 2024-05-28 14:51:22,470 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 17:22:35, time: 0.926, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0517, loss_cls: 0.2125, acc: 92.4673, loss_bbox: 0.2611, loss_mask: 0.2552, loss: 0.8058 2024-05-28 14:52:08,991 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 17:21:48, time: 0.930, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0511, loss_cls: 0.2136, acc: 92.3728, loss_bbox: 0.2637, loss_mask: 0.2588, loss: 0.8116 2024-05-28 14:52:59,933 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 17:21:16, time: 1.019, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0495, loss_cls: 0.2171, acc: 92.3521, loss_bbox: 0.2623, loss_mask: 0.2581, loss: 0.8128 2024-05-28 14:53:48,283 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 17:20:35, time: 0.967, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0501, loss_cls: 0.2143, acc: 92.3970, loss_bbox: 0.2578, loss_mask: 0.2484, loss: 0.7952 2024-05-28 14:54:34,560 - mmdet - INFO - Epoch [3][6250/7330] lr: 1.000e-04, eta: 17:19:48, time: 0.926, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0472, loss_cls: 0.2061, acc: 92.7539, loss_bbox: 0.2506, loss_mask: 0.2526, loss: 0.7786 2024-05-28 14:55:20,339 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 17:18:59, time: 0.916, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0504, loss_cls: 0.2087, acc: 92.6936, loss_bbox: 0.2601, loss_mask: 0.2549, loss: 0.7987 2024-05-28 14:56:05,967 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 17:18:10, time: 0.913, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0498, loss_cls: 0.2112, acc: 92.4324, loss_bbox: 0.2663, loss_mask: 0.2577, loss: 0.8071 2024-05-28 14:56:52,269 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 17:17:22, time: 0.926, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0497, loss_cls: 0.2176, acc: 92.2864, loss_bbox: 0.2648, loss_mask: 0.2627, loss: 0.8197 2024-05-28 
14:57:41,970 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 17:16:46, time: 0.994, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0544, loss_cls: 0.2157, acc: 92.3613, loss_bbox: 0.2637, loss_mask: 0.2577, loss: 0.8178 2024-05-28 14:58:28,412 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 17:15:59, time: 0.929, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0517, loss_cls: 0.2212, acc: 92.1719, loss_bbox: 0.2658, loss_mask: 0.2584, loss: 0.8213 2024-05-28 14:59:14,267 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 17:15:11, time: 0.917, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0507, loss_cls: 0.2101, acc: 92.5676, loss_bbox: 0.2621, loss_mask: 0.2547, loss: 0.8015 2024-05-28 14:59:59,799 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 17:14:21, time: 0.911, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0483, loss_cls: 0.2107, acc: 92.5972, loss_bbox: 0.2557, loss_mask: 0.2551, loss: 0.7928 2024-05-28 15:00:45,797 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 17:13:33, time: 0.920, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0520, loss_cls: 0.2247, acc: 91.9988, loss_bbox: 0.2734, loss_mask: 0.2589, loss: 0.8325 2024-05-28 15:01:31,870 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 17:12:45, time: 0.922, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0501, loss_cls: 0.2218, acc: 92.3979, loss_bbox: 0.2653, loss_mask: 0.2633, loss: 0.8261 2024-05-28 15:02:17,403 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 17:11:55, time: 0.911, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0484, loss_cls: 0.2118, acc: 92.4292, loss_bbox: 0.2602, loss_mask: 0.2574, loss: 0.8010 2024-05-28 15:03:02,835 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 17:11:05, time: 0.909, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0468, loss_cls: 0.1997, acc: 92.9622, loss_bbox: 0.2399, loss_mask: 0.2507, loss: 0.7601 2024-05-28 15:03:49,236 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 17:10:18, time: 0.928, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0498, loss_cls: 0.2050, acc: 92.6365, loss_bbox: 0.2547, loss_mask: 0.2514, loss: 0.7844 2024-05-28 15:04:38,049 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 17:09:39, time: 0.976, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0486, loss_cls: 0.2126, acc: 92.4421, loss_bbox: 0.2646, loss_mask: 0.2601, loss: 0.8091 2024-05-28 15:05:23,837 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 17:08:50, time: 0.916, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0481, loss_cls: 0.2074, acc: 92.5474, loss_bbox: 0.2606, loss_mask: 0.2595, loss: 0.8008 2024-05-28 15:06:14,638 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 17:08:17, time: 1.016, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0518, loss_cls: 0.2202, acc: 92.3474, loss_bbox: 0.2636, loss_mask: 0.2545, loss: 0.8154 2024-05-28 15:07:01,195 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 17:07:30, time: 0.931, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0526, loss_cls: 0.2303, acc: 91.9568, loss_bbox: 0.2751, loss_mask: 0.2620, loss: 0.8473 2024-05-28 15:07:50,163 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 17:06:51, time: 0.979, 
data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0502, loss_cls: 0.2132, acc: 92.4714, loss_bbox: 0.2637, loss_mask: 0.2553, loss: 0.8087
2024-05-28 15:08:36,640 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 17:06:05, time: 0.930, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0491, loss_cls: 0.2066, acc: 92.5396, loss_bbox: 0.2607, loss_mask: 0.2504, loss: 0.7901
2024-05-28 15:09:25,415 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 17:05:25, time: 0.975, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0492, loss_cls: 0.2103, acc: 92.4399, loss_bbox: 0.2637, loss_mask: 0.2592, loss: 0.8059
2024-05-28 15:10:13,558 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 17:04:43, time: 0.963, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0493, loss_cls: 0.2129, acc: 92.5247, loss_bbox: 0.2596, loss_mask: 0.2549, loss: 0.8006
2024-05-28 15:11:01,946 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 17:04:02, time: 0.968, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0499, loss_cls: 0.2109, acc: 92.3955, loss_bbox: 0.2612, loss_mask: 0.2557, loss: 0.8014
2024-05-28 15:11:30,078 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-05-28 15:13:42,411 - mmdet - INFO - Evaluating bbox...
2024-05-28 15:14:09,867 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.671
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.464
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.260
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.473
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.570
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.353
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.597
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.700
2024-05-28 15:14:09,867 - mmdet - INFO - Evaluating segm...
2024-05-28 15:14:38,381 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.388
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.632
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.410
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.181
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.426
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.599
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.502
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.502
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.502
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.286
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.555
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.687
2024-05-28 15:14:38,759 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 15:14:38,760 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.4240, bbox_mAP_50: 0.6710, bbox_mAP_75: 0.4640, bbox_mAP_s: 0.2600, bbox_mAP_m: 0.4730, bbox_mAP_l: 0.5700, bbox_mAP_copypaste: 0.424 0.671 0.464 0.260 0.473 0.570, segm_mAP: 0.3880, segm_mAP_50: 0.6320, segm_mAP_75: 0.4100, segm_mAP_s: 0.1810, segm_mAP_m: 0.4260, segm_mAP_l: 0.5990, segm_mAP_copypaste: 0.388 0.632 0.410 0.181 0.426 0.599
2024-05-28 15:15:30,609 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 17:01:40, time: 1.036, data_time: 0.124, memory: 20925, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0482, loss_cls: 0.1993, acc: 92.7148, loss_bbox: 0.2489, loss_mask: 0.2457, loss: 0.7630
2024-05-28 15:16:16,216 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 17:00:51, time: 0.912, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0470, loss_cls: 0.2045, acc: 92.7000, loss_bbox: 0.2524, loss_mask: 0.2474, loss: 0.7732
2024-05-28 15:17:02,289 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 17:00:03, time: 0.922, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0495, loss_cls: 0.2051, acc: 92.7446, loss_bbox: 0.2526, loss_mask: 0.2549, loss: 0.7857
2024-05-28 15:17:48,274 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 16:59:15, time: 0.920, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0510, loss_cls: 0.2037, acc: 92.6843, loss_bbox: 0.2568, loss_mask: 0.2548, loss: 0.7898
2024-05-28 15:18:34,552 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 16:58:28, time: 0.926, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0511, loss_cls: 0.2025, acc: 92.7693, loss_bbox: 0.2517, loss_mask: 0.2489, loss: 0.7780
2024-05-28 15:19:20,908 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 16:57:41, time: 0.927, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0473, loss_cls: 0.1983, acc: 92.8599, loss_bbox: 0.2453, loss_mask: 0.2572, loss: 0.7703
2024-05-28 15:20:07,918 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 16:56:56, time: 0.940, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0529, loss_cls: 0.2131, acc: 92.4270, loss_bbox: 0.2665, loss_mask: 0.2554, loss: 0.8115
2024-05-28 15:20:53,912 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 16:56:08, time: 0.920, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0510, loss_cls: 0.2064, acc: 92.5872, loss_bbox: 0.2534,
loss_mask: 0.2538, loss: 0.7891 2024-05-28 15:21:39,699 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 16:55:19, time: 0.916, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0500, loss_cls: 0.2097, acc: 92.5508, loss_bbox: 0.2591, loss_mask: 0.2581, loss: 0.7992 2024-05-28 15:22:25,609 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 16:54:31, time: 0.918, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0523, loss_cls: 0.2237, acc: 92.0271, loss_bbox: 0.2729, loss_mask: 0.2565, loss: 0.8285 2024-05-28 15:23:11,025 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 16:53:42, time: 0.908, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0481, loss_cls: 0.2016, acc: 92.7268, loss_bbox: 0.2496, loss_mask: 0.2491, loss: 0.7685 2024-05-28 15:23:56,726 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 16:52:53, time: 0.914, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0473, loss_cls: 0.2056, acc: 92.6201, loss_bbox: 0.2555, loss_mask: 0.2459, loss: 0.7739 2024-05-28 15:24:42,873 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 16:52:05, time: 0.923, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0492, loss_cls: 0.2075, acc: 92.5344, loss_bbox: 0.2570, loss_mask: 0.2552, loss: 0.7912 2024-05-28 15:25:34,220 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 16:51:33, time: 1.027, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0481, loss_cls: 0.2073, acc: 92.5659, loss_bbox: 0.2562, loss_mask: 0.2490, loss: 0.7827 2024-05-28 15:26:19,482 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 16:50:43, time: 0.905, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0471, loss_cls: 0.1958, acc: 92.8713, loss_bbox: 0.2479, loss_mask: 0.2481, loss: 0.7605 2024-05-28 15:27:05,485 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 16:49:55, time: 0.920, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0503, loss_cls: 0.1988, acc: 92.7954, loss_bbox: 0.2490, loss_mask: 0.2487, loss: 0.7686 2024-05-28 15:27:54,275 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 16:49:15, time: 0.976, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0463, loss_cls: 0.1975, acc: 92.8010, loss_bbox: 0.2523, loss_mask: 0.2506, loss: 0.7698 2024-05-28 15:28:41,231 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 16:48:30, time: 0.939, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0515, loss_cls: 0.2048, acc: 92.6248, loss_bbox: 0.2555, loss_mask: 0.2535, loss: 0.7884 2024-05-28 15:29:29,808 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 16:47:49, time: 0.971, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0521, loss_cls: 0.2078, acc: 92.4719, loss_bbox: 0.2568, loss_mask: 0.2533, loss: 0.7937 2024-05-28 15:30:16,318 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 16:47:03, time: 0.930, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0506, loss_cls: 0.2120, acc: 92.3530, loss_bbox: 0.2656, loss_mask: 0.2536, loss: 0.8052 2024-05-28 15:31:02,289 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 16:46:15, time: 0.919, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0479, loss_cls: 0.2022, acc: 92.6113, loss_bbox: 0.2581, loss_mask: 0.2521, loss: 0.7809 2024-05-28 15:31:48,126 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 16:45:26, time: 0.917, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0496, loss_cls: 0.2051, acc: 92.6482, loss_bbox: 0.2561, loss_mask: 0.2499, loss: 0.7837 2024-05-28 15:32:36,435 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 16:44:45, time: 0.966, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0502, loss_cls: 0.2083, acc: 92.6028, loss_bbox: 0.2571, loss_mask: 0.2540, loss: 0.7909 2024-05-28 15:33:22,178 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 16:43:56, time: 0.915, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0482, loss_cls: 0.1984, acc: 92.8716, loss_bbox: 0.2495, loss_mask: 0.2528, loss: 0.7688 2024-05-28 15:34:11,630 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 16:43:18, time: 0.989, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0489, loss_cls: 0.1998, acc: 92.6638, loss_bbox: 0.2589, loss_mask: 0.2511, loss: 0.7822 2024-05-28 15:35:00,331 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 16:42:38, time: 0.974, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0536, loss_cls: 0.2044, acc: 92.6787, loss_bbox: 0.2556, loss_mask: 0.2513, loss: 0.7897 2024-05-28 15:35:45,950 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 16:41:49, time: 0.912, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0468, loss_cls: 0.2021, acc: 92.8008, loss_bbox: 0.2484, loss_mask: 0.2424, loss: 0.7610 2024-05-28 15:36:32,685 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 16:41:03, time: 0.935, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0499, loss_cls: 0.2067, acc: 92.5339, loss_bbox: 0.2653, loss_mask: 0.2587, loss: 0.8037 2024-05-28 15:37:21,076 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 16:40:22, time: 0.968, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0464, loss_cls: 0.1920, acc: 93.0710, loss_bbox: 0.2443, loss_mask: 0.2504, loss: 0.7540 2024-05-28 15:38:08,166 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 16:39:37, time: 0.942, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0505, loss_cls: 0.2158, acc: 92.4658, loss_bbox: 0.2594, loss_mask: 0.2533, loss: 0.8029 2024-05-28 15:38:54,101 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 16:38:48, time: 0.919, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0478, loss_cls: 0.2042, acc: 92.7319, loss_bbox: 0.2549, loss_mask: 0.2465, loss: 0.7778 2024-05-28 15:39:39,974 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 16:38:00, time: 0.917, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0461, loss_cls: 0.1975, acc: 93.0762, loss_bbox: 0.2416, loss_mask: 0.2469, loss: 0.7538 2024-05-28 15:40:26,085 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 16:37:13, time: 0.922, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0487, loss_cls: 0.2041, acc: 92.8164, loss_bbox: 0.2531, loss_mask: 0.2544, loss: 0.7819 2024-05-28 15:41:11,844 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 16:36:24, time: 0.915, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0483, loss_cls: 0.2023, acc: 92.5850, loss_bbox: 0.2595, loss_mask: 0.2510, loss: 0.7823 2024-05-28 15:42:01,942 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 16:35:47, time: 1.002, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0216, 
loss_rpn_bbox: 0.0478, loss_cls: 0.1966, acc: 92.9268, loss_bbox: 0.2441, loss_mask: 0.2424, loss: 0.7525 2024-05-28 15:42:48,327 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 16:35:00, time: 0.928, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0489, loss_cls: 0.2083, acc: 92.4995, loss_bbox: 0.2621, loss_mask: 0.2542, loss: 0.7965 2024-05-28 15:43:34,037 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 16:34:12, time: 0.914, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0505, loss_cls: 0.2060, acc: 92.5549, loss_bbox: 0.2590, loss_mask: 0.2516, loss: 0.7897 2024-05-28 15:44:22,891 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 16:33:31, time: 0.977, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0544, loss_cls: 0.2192, acc: 92.0620, loss_bbox: 0.2743, loss_mask: 0.2577, loss: 0.8288 2024-05-28 15:45:09,101 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 16:32:44, time: 0.924, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0452, loss_cls: 0.1952, acc: 93.0823, loss_bbox: 0.2424, loss_mask: 0.2456, loss: 0.7486 2024-05-28 15:45:56,951 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 16:32:01, time: 0.957, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0476, loss_cls: 0.2031, acc: 92.7183, loss_bbox: 0.2540, loss_mask: 0.2484, loss: 0.7749 2024-05-28 15:46:43,217 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 16:31:14, time: 0.925, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0493, loss_cls: 0.2029, acc: 92.7227, loss_bbox: 0.2559, loss_mask: 0.2527, loss: 0.7838 2024-05-28 15:47:29,023 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 16:30:25, time: 0.916, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0506, loss_cls: 0.2097, acc: 92.4871, loss_bbox: 0.2580, loss_mask: 0.2558, loss: 0.7975 2024-05-28 15:48:14,625 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 16:29:37, time: 0.912, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0500, loss_cls: 0.2065, acc: 92.4919, loss_bbox: 0.2621, loss_mask: 0.2613, loss: 0.8041 2024-05-28 15:49:02,971 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 16:28:55, time: 0.967, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0488, loss_cls: 0.1966, acc: 92.9368, loss_bbox: 0.2452, loss_mask: 0.2501, loss: 0.7619 2024-05-28 15:49:49,080 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 16:28:07, time: 0.922, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0483, loss_cls: 0.2069, acc: 92.5637, loss_bbox: 0.2562, loss_mask: 0.2482, loss: 0.7818 2024-05-28 15:50:39,221 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 16:27:30, time: 1.003, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0492, loss_cls: 0.2104, acc: 92.4714, loss_bbox: 0.2617, loss_mask: 0.2504, loss: 0.7938 2024-05-28 15:51:28,171 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 16:26:50, time: 0.979, data_time: 0.067, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0492, loss_cls: 0.2043, acc: 92.7507, loss_bbox: 0.2520, loss_mask: 0.2469, loss: 0.7745 2024-05-28 15:52:13,185 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 16:25:59, time: 0.900, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0483, loss_cls: 0.1963, acc: 92.9150, loss_bbox: 0.2469, loss_mask: 0.2489, 
loss: 0.7610 2024-05-28 15:52:59,390 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 16:25:12, time: 0.924, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0514, loss_cls: 0.2044, acc: 92.6257, loss_bbox: 0.2543, loss_mask: 0.2478, loss: 0.7800 2024-05-28 15:53:47,222 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 16:24:29, time: 0.957, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0482, loss_cls: 0.2069, acc: 92.6545, loss_bbox: 0.2552, loss_mask: 0.2560, loss: 0.7886 2024-05-28 15:54:33,599 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 16:23:42, time: 0.928, data_time: 0.068, memory: 20925, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0517, loss_cls: 0.2098, acc: 92.4551, loss_bbox: 0.2606, loss_mask: 0.2527, loss: 0.7998 2024-05-28 15:55:20,339 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 16:22:56, time: 0.935, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0524, loss_cls: 0.2138, acc: 92.3350, loss_bbox: 0.2648, loss_mask: 0.2545, loss: 0.8089 2024-05-28 15:56:05,978 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 16:22:07, time: 0.913, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0441, loss_cls: 0.2025, acc: 92.8293, loss_bbox: 0.2482, loss_mask: 0.2476, loss: 0.7642 2024-05-28 15:56:51,833 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 16:21:19, time: 0.917, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0475, loss_cls: 0.1946, acc: 93.0354, loss_bbox: 0.2440, loss_mask: 0.2413, loss: 0.7491 2024-05-28 15:57:37,923 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 16:20:31, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0484, loss_cls: 0.2109, acc: 92.3972, loss_bbox: 0.2601, loss_mask: 0.2517, loss: 0.7926 2024-05-28 15:58:28,382 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 16:19:55, time: 1.009, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0487, loss_cls: 0.2003, acc: 92.6372, loss_bbox: 0.2555, loss_mask: 0.2506, loss: 0.7773 2024-05-28 15:59:14,164 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 16:19:06, time: 0.916, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0497, loss_cls: 0.1977, acc: 92.8105, loss_bbox: 0.2505, loss_mask: 0.2484, loss: 0.7691 2024-05-28 15:59:59,958 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 16:18:18, time: 0.916, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0495, loss_cls: 0.2084, acc: 92.5405, loss_bbox: 0.2544, loss_mask: 0.2497, loss: 0.7854 2024-05-28 16:00:50,319 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 16:17:41, time: 1.007, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0506, loss_cls: 0.2138, acc: 92.3091, loss_bbox: 0.2625, loss_mask: 0.2539, loss: 0.8035 2024-05-28 16:01:36,670 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 16:16:54, time: 0.927, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0489, loss_cls: 0.2005, acc: 92.8066, loss_bbox: 0.2482, loss_mask: 0.2471, loss: 0.7691 2024-05-28 16:02:21,775 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 16:16:04, time: 0.902, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0461, loss_cls: 0.1957, acc: 92.9929, loss_bbox: 0.2477, loss_mask: 0.2504, loss: 0.7593 2024-05-28 16:03:09,943 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
16:15:21, time: 0.963, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0479, loss_cls: 0.2004, acc: 92.7925, loss_bbox: 0.2506, loss_mask: 0.2520, loss: 0.7729 2024-05-28 16:03:55,477 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 16:14:32, time: 0.911, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0463, loss_cls: 0.2086, acc: 92.4824, loss_bbox: 0.2564, loss_mask: 0.2513, loss: 0.7829 2024-05-28 16:04:41,137 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 16:13:44, time: 0.913, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0473, loss_cls: 0.2015, acc: 92.7241, loss_bbox: 0.2443, loss_mask: 0.2448, loss: 0.7596 2024-05-28 16:05:29,767 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 16:13:02, time: 0.973, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0477, loss_cls: 0.2046, acc: 92.7324, loss_bbox: 0.2534, loss_mask: 0.2430, loss: 0.7714 2024-05-28 16:06:15,880 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 16:12:15, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0516, loss_cls: 0.2043, acc: 92.5640, loss_bbox: 0.2537, loss_mask: 0.2497, loss: 0.7804 2024-05-28 16:07:03,818 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 16:11:32, time: 0.959, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0493, loss_cls: 0.2064, acc: 92.5496, loss_bbox: 0.2588, loss_mask: 0.2598, loss: 0.7967 2024-05-28 16:07:51,904 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 16:10:49, time: 0.962, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0476, loss_cls: 0.2010, acc: 92.8213, loss_bbox: 0.2478, loss_mask: 0.2509, loss: 0.7697 2024-05-28 16:08:37,748 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 16:10:00, time: 0.917, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0504, loss_cls: 0.2116, acc: 92.4800, loss_bbox: 0.2581, loss_mask: 0.2525, loss: 0.7949 2024-05-28 16:09:23,457 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 16:09:12, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0495, loss_cls: 0.2049, acc: 92.5903, loss_bbox: 0.2507, loss_mask: 0.2446, loss: 0.7743 2024-05-28 16:10:11,633 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 16:08:29, time: 0.964, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0489, loss_cls: 0.2085, acc: 92.5583, loss_bbox: 0.2574, loss_mask: 0.2509, loss: 0.7874 2024-05-28 16:10:57,182 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 16:07:40, time: 0.911, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0497, loss_cls: 0.2046, acc: 92.6528, loss_bbox: 0.2543, loss_mask: 0.2453, loss: 0.7769 2024-05-28 16:11:43,282 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 16:06:53, time: 0.922, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0500, loss_cls: 0.2020, acc: 92.7046, loss_bbox: 0.2515, loss_mask: 0.2459, loss: 0.7711 2024-05-28 16:12:29,256 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 16:06:05, time: 0.919, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0475, loss_cls: 0.2024, acc: 92.6980, loss_bbox: 0.2552, loss_mask: 0.2469, loss: 0.7739 2024-05-28 16:13:15,278 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 16:05:17, time: 0.920, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 
0.0483, loss_cls: 0.2010, acc: 92.7317, loss_bbox: 0.2577, loss_mask: 0.2466, loss: 0.7777 2024-05-28 16:14:01,071 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 16:04:29, time: 0.916, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0473, loss_cls: 0.1948, acc: 93.1052, loss_bbox: 0.2405, loss_mask: 0.2429, loss: 0.7452 2024-05-28 16:14:51,224 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 16:03:51, time: 1.003, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0436, loss_cls: 0.1913, acc: 93.0615, loss_bbox: 0.2363, loss_mask: 0.2371, loss: 0.7285 2024-05-28 16:15:37,114 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 16:03:03, time: 0.918, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0503, loss_cls: 0.2038, acc: 92.6270, loss_bbox: 0.2564, loss_mask: 0.2477, loss: 0.7802 2024-05-28 16:16:23,321 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 16:02:15, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0502, loss_cls: 0.2115, acc: 92.3555, loss_bbox: 0.2644, loss_mask: 0.2563, loss: 0.8046 2024-05-28 16:17:09,333 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 16:01:27, time: 0.920, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0445, loss_cls: 0.1981, acc: 92.9387, loss_bbox: 0.2422, loss_mask: 0.2378, loss: 0.7426 2024-05-28 16:17:57,867 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 16:00:46, time: 0.971, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0494, loss_cls: 0.2041, acc: 92.5957, loss_bbox: 0.2530, loss_mask: 0.2480, loss: 0.7771 2024-05-28 16:18:44,406 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 15:59:59, time: 0.931, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0481, loss_cls: 0.1972, acc: 92.8496, loss_bbox: 0.2465, loss_mask: 0.2458, loss: 0.7597 2024-05-28 16:19:32,582 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 15:59:16, time: 0.964, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0498, loss_cls: 0.2076, acc: 92.6003, loss_bbox: 0.2558, loss_mask: 0.2444, loss: 0.7800 2024-05-28 16:20:19,132 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 15:58:30, time: 0.931, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0499, loss_cls: 0.2111, acc: 92.2812, loss_bbox: 0.2643, loss_mask: 0.2524, loss: 0.8006 2024-05-28 16:21:04,192 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 15:57:40, time: 0.901, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0464, loss_cls: 0.1952, acc: 92.9922, loss_bbox: 0.2450, loss_mask: 0.2427, loss: 0.7505 2024-05-28 16:21:49,972 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 15:56:51, time: 0.916, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0470, loss_cls: 0.1952, acc: 92.9255, loss_bbox: 0.2401, loss_mask: 0.2372, loss: 0.7413 2024-05-28 16:22:40,132 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 15:56:13, time: 1.003, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0477, loss_cls: 0.2046, acc: 92.4744, loss_bbox: 0.2592, loss_mask: 0.2488, loss: 0.7825 2024-05-28 16:23:27,851 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 15:55:29, time: 0.954, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0480, loss_cls: 0.1947, acc: 92.9727, loss_bbox: 0.2433, loss_mask: 0.2453, loss: 0.7518 
2024-05-28 16:24:16,367 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 15:54:47, time: 0.970, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0477, loss_cls: 0.1970, acc: 92.9180, loss_bbox: 0.2510, loss_mask: 0.2491, loss: 0.7661 2024-05-28 16:25:02,516 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 15:54:00, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0469, loss_cls: 0.2097, acc: 92.5093, loss_bbox: 0.2573, loss_mask: 0.2504, loss: 0.7867 2024-05-28 16:25:48,896 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 15:53:13, time: 0.928, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0501, loss_cls: 0.2164, acc: 92.1648, loss_bbox: 0.2709, loss_mask: 0.2564, loss: 0.8166 2024-05-28 16:26:35,016 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 15:52:25, time: 0.922, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0481, loss_cls: 0.2033, acc: 92.6726, loss_bbox: 0.2460, loss_mask: 0.2448, loss: 0.7655 2024-05-28 16:27:23,519 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 15:51:43, time: 0.970, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0500, loss_cls: 0.2010, acc: 92.6409, loss_bbox: 0.2559, loss_mask: 0.2513, loss: 0.7791 2024-05-28 16:28:09,795 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 15:50:56, time: 0.925, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0466, loss_cls: 0.1980, acc: 92.8633, loss_bbox: 0.2501, loss_mask: 0.2417, loss: 0.7561 2024-05-28 16:28:56,109 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 15:50:09, time: 0.926, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0480, loss_cls: 0.1999, acc: 92.7012, loss_bbox: 0.2574, loss_mask: 0.2546, loss: 0.7829 2024-05-28 16:29:42,008 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 15:49:21, time: 0.918, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0485, loss_cls: 0.2073, acc: 92.6221, loss_bbox: 0.2553, loss_mask: 0.2483, loss: 0.7823 2024-05-28 16:30:28,533 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 15:48:34, time: 0.931, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0486, loss_cls: 0.1976, acc: 92.8516, loss_bbox: 0.2500, loss_mask: 0.2432, loss: 0.7602 2024-05-28 16:31:18,962 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 15:47:56, time: 1.009, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0507, loss_cls: 0.2061, acc: 92.5544, loss_bbox: 0.2568, loss_mask: 0.2462, loss: 0.7829 2024-05-28 16:32:05,020 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 15:47:09, time: 0.921, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0471, loss_cls: 0.2073, acc: 92.5391, loss_bbox: 0.2531, loss_mask: 0.2416, loss: 0.7697 2024-05-28 16:32:51,052 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 15:46:21, time: 0.921, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0502, loss_cls: 0.2065, acc: 92.4951, loss_bbox: 0.2592, loss_mask: 0.2518, loss: 0.7903 2024-05-28 16:33:37,593 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 15:45:34, time: 0.931, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0510, loss_cls: 0.2089, acc: 92.3494, loss_bbox: 0.2594, loss_mask: 0.2507, loss: 0.7923 2024-05-28 16:34:26,139 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 15:44:52, 
time: 0.971, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0489, loss_cls: 0.2045, acc: 92.6106, loss_bbox: 0.2530, loss_mask: 0.2511, loss: 0.7786 2024-05-28 16:35:12,651 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 15:44:06, time: 0.930, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0488, loss_cls: 0.2002, acc: 92.7251, loss_bbox: 0.2515, loss_mask: 0.2466, loss: 0.7687 2024-05-28 16:36:00,631 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 15:43:22, time: 0.960, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0487, loss_cls: 0.1985, acc: 92.7837, loss_bbox: 0.2517, loss_mask: 0.2475, loss: 0.7686 2024-05-28 16:36:46,311 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 15:42:34, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0473, loss_cls: 0.2001, acc: 92.7358, loss_bbox: 0.2534, loss_mask: 0.2510, loss: 0.7734 2024-05-28 16:37:31,522 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 15:41:44, time: 0.904, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0459, loss_cls: 0.1973, acc: 92.9395, loss_bbox: 0.2460, loss_mask: 0.2455, loss: 0.7562 2024-05-28 16:38:17,853 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 15:40:57, time: 0.927, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0479, loss_cls: 0.2094, acc: 92.5149, loss_bbox: 0.2550, loss_mask: 0.2539, loss: 0.7877 2024-05-28 16:39:05,892 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 15:40:14, time: 0.961, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0460, loss_cls: 0.1912, acc: 93.2029, loss_bbox: 0.2397, loss_mask: 0.2404, loss: 0.7390 2024-05-28 16:39:55,121 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 15:39:33, time: 0.985, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0478, loss_cls: 0.1951, acc: 93.0073, loss_bbox: 0.2444, loss_mask: 0.2448, loss: 0.7548 2024-05-28 16:40:44,450 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 15:38:52, time: 0.987, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0475, loss_cls: 0.2100, acc: 92.4490, loss_bbox: 0.2613, loss_mask: 0.2524, loss: 0.7943 2024-05-28 16:41:29,912 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 15:38:03, time: 0.909, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0459, loss_cls: 0.1951, acc: 93.0312, loss_bbox: 0.2410, loss_mask: 0.2426, loss: 0.7450 2024-05-28 16:42:15,480 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 15:37:15, time: 0.911, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0483, loss_cls: 0.1980, acc: 92.9224, loss_bbox: 0.2439, loss_mask: 0.2433, loss: 0.7552 2024-05-28 16:43:01,872 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 15:36:28, time: 0.928, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0474, loss_cls: 0.2103, acc: 92.3718, loss_bbox: 0.2572, loss_mask: 0.2508, loss: 0.7877 2024-05-28 16:43:50,183 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 15:35:45, time: 0.966, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0486, loss_cls: 0.2068, acc: 92.4519, loss_bbox: 0.2590, loss_mask: 0.2563, loss: 0.7929 2024-05-28 16:44:35,924 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 15:34:56, time: 0.915, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0464, 
loss_cls: 0.2018, acc: 92.8074, loss_bbox: 0.2445, loss_mask: 0.2465, loss: 0.7631 2024-05-28 16:45:21,419 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 15:34:07, time: 0.910, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0466, loss_cls: 0.1989, acc: 92.8674, loss_bbox: 0.2509, loss_mask: 0.2467, loss: 0.7661 2024-05-28 16:46:07,231 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 15:33:19, time: 0.917, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0497, loss_cls: 0.2045, acc: 92.6077, loss_bbox: 0.2629, loss_mask: 0.2531, loss: 0.7932 2024-05-28 16:46:53,028 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 15:32:31, time: 0.916, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0472, loss_cls: 0.2028, acc: 92.7256, loss_bbox: 0.2532, loss_mask: 0.2511, loss: 0.7763 2024-05-28 16:47:39,379 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 15:31:44, time: 0.927, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0488, loss_cls: 0.1989, acc: 92.8938, loss_bbox: 0.2478, loss_mask: 0.2447, loss: 0.7618 2024-05-28 16:48:29,279 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 15:31:04, time: 0.998, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0469, loss_cls: 0.2012, acc: 92.8047, loss_bbox: 0.2489, loss_mask: 0.2443, loss: 0.7625 2024-05-28 16:49:15,141 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 15:30:16, time: 0.917, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0448, loss_cls: 0.1927, acc: 93.0754, loss_bbox: 0.2431, loss_mask: 0.2442, loss: 0.7457 2024-05-28 16:50:00,531 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 15:29:27, time: 0.908, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0453, loss_cls: 0.1934, acc: 93.1787, loss_bbox: 0.2362, loss_mask: 0.2466, loss: 0.7424 2024-05-28 16:50:49,578 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 15:28:46, time: 0.981, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0512, loss_cls: 0.2117, acc: 92.4465, loss_bbox: 0.2589, loss_mask: 0.2531, loss: 0.7981 2024-05-28 16:51:35,557 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 15:27:58, time: 0.920, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0493, loss_cls: 0.1969, acc: 92.8511, loss_bbox: 0.2435, loss_mask: 0.2484, loss: 0.7608 2024-05-28 16:52:24,814 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 15:27:17, time: 0.985, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0512, loss_cls: 0.2011, acc: 92.8333, loss_bbox: 0.2536, loss_mask: 0.2503, loss: 0.7781 2024-05-28 16:53:10,432 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 15:26:29, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0474, loss_cls: 0.2001, acc: 92.7100, loss_bbox: 0.2567, loss_mask: 0.2510, loss: 0.7765 2024-05-28 16:53:56,993 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 15:25:42, time: 0.931, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0505, loss_cls: 0.2098, acc: 92.4712, loss_bbox: 0.2586, loss_mask: 0.2519, loss: 0.7952 2024-05-28 16:54:43,294 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 15:24:55, time: 0.926, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0454, loss_cls: 0.1983, acc: 92.8606, loss_bbox: 0.2462, loss_mask: 0.2523, loss: 0.7641 2024-05-28 
16:55:33,154 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 15:24:15, time: 0.997, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0462, loss_cls: 0.2062, acc: 92.7141, loss_bbox: 0.2502, loss_mask: 0.2468, loss: 0.7710 2024-05-28 16:56:22,437 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 15:23:34, time: 0.986, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0474, loss_cls: 0.2075, acc: 92.5793, loss_bbox: 0.2547, loss_mask: 0.2489, loss: 0.7797 2024-05-28 16:57:08,636 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 15:22:47, time: 0.924, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0469, loss_cls: 0.2100, acc: 92.4817, loss_bbox: 0.2531, loss_mask: 0.2494, loss: 0.7822 2024-05-28 16:57:57,290 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 15:22:04, time: 0.973, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0485, loss_cls: 0.2035, acc: 92.5920, loss_bbox: 0.2529, loss_mask: 0.2476, loss: 0.7749 2024-05-28 16:58:42,489 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 15:21:15, time: 0.904, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0455, loss_cls: 0.2044, acc: 92.7200, loss_bbox: 0.2452, loss_mask: 0.2424, loss: 0.7589 2024-05-28 16:59:27,996 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 15:20:26, time: 0.910, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0479, loss_cls: 0.2025, acc: 92.8015, loss_bbox: 0.2513, loss_mask: 0.2456, loss: 0.7695 2024-05-28 17:00:15,727 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 15:19:42, time: 0.955, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0477, loss_cls: 0.2019, acc: 92.7766, loss_bbox: 0.2493, loss_mask: 0.2460, loss: 0.7649 2024-05-28 17:01:01,865 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 15:18:54, time: 0.923, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0481, loss_cls: 0.1985, acc: 92.8855, loss_bbox: 0.2474, loss_mask: 0.2435, loss: 0.7606 2024-05-28 17:01:47,880 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 15:18:07, time: 0.920, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0480, loss_cls: 0.1956, acc: 92.9070, loss_bbox: 0.2447, loss_mask: 0.2439, loss: 0.7528 2024-05-28 17:02:33,377 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 15:17:18, time: 0.910, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0455, loss_cls: 0.1928, acc: 92.9658, loss_bbox: 0.2455, loss_mask: 0.2462, loss: 0.7489 2024-05-28 17:03:19,844 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 15:16:31, time: 0.929, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0481, loss_cls: 0.2051, acc: 92.5347, loss_bbox: 0.2507, loss_mask: 0.2444, loss: 0.7697 2024-05-28 17:04:05,950 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 15:15:43, time: 0.922, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0446, loss_cls: 0.1988, acc: 92.8608, loss_bbox: 0.2451, loss_mask: 0.2447, loss: 0.7537 2024-05-28 17:04:56,278 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 15:15:04, time: 1.007, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0461, loss_cls: 0.2070, acc: 92.5859, loss_bbox: 0.2539, loss_mask: 0.2471, loss: 0.7768 2024-05-28 17:05:42,030 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 15:14:16, time: 0.915, 
data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0486, loss_cls: 0.2056, acc: 92.5535, loss_bbox: 0.2547, loss_mask: 0.2528, loss: 0.7862
2024-05-28 17:06:28,169 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 15:13:29, time: 0.923, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0460, loss_cls: 0.1915, acc: 93.1094, loss_bbox: 0.2360, loss_mask: 0.2476, loss: 0.7434
2024-05-28 17:07:16,337 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 15:12:45, time: 0.963, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0462, loss_cls: 0.1993, acc: 92.8855, loss_bbox: 0.2421, loss_mask: 0.2425, loss: 0.7481
2024-05-28 17:08:02,286 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 15:11:57, time: 0.919, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0465, loss_cls: 0.2022, acc: 92.6711, loss_bbox: 0.2522, loss_mask: 0.2525, loss: 0.7740
2024-05-28 17:08:50,042 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 15:11:13, time: 0.955, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0486, loss_cls: 0.2050, acc: 92.5723, loss_bbox: 0.2517, loss_mask: 0.2461, loss: 0.7734
2024-05-28 17:09:18,326 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-05-28 17:11:28,092 - mmdet - INFO - Evaluating bbox...
2024-05-28 17:11:52,669 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.440
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.685
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.485
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.270
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.489
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.590
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.567
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.567
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.567
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.375
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.617
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.722
2024-05-28 17:11:52,669 - mmdet - INFO - Evaluating segm...
2024-05-28 17:12:18,554 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.401
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.650
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.425
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.191
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.444
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.620
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.519
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.519
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.519
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.570
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.705
2024-05-28 17:12:18,962 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 17:12:18,964 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4400, bbox_mAP_50: 0.6850, bbox_mAP_75: 0.4850, bbox_mAP_s: 0.2700, bbox_mAP_m: 0.4890, bbox_mAP_l: 0.5900, bbox_mAP_copypaste: 0.440 0.685 0.485 0.270 0.489 0.590, segm_mAP: 0.4010, segm_mAP_50: 0.6500, segm_mAP_75: 0.4250, segm_mAP_s: 0.1910, segm_mAP_m: 0.4440, segm_mAP_l: 0.6200, segm_mAP_copypaste: 0.401 0.650 0.425 0.191 0.444 0.620
2024-05-28 17:13:18,366 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 15:09:28, time: 1.188, data_time: 0.111, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0431, loss_cls: 0.1838, acc: 93.3630, loss_bbox: 0.2388, loss_mask: 0.2362, loss: 0.7210
2024-05-28 17:14:04,306 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 15:08:40, time: 0.919, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0485, loss_cls: 0.1916, acc: 92.9919, loss_bbox: 0.2440, loss_mask: 0.2365, loss: 0.7418
2024-05-28 17:14:52,379 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 15:07:57, time: 0.961, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0471, loss_cls: 0.1891, acc: 93.0105, loss_bbox: 0.2433, loss_mask: 0.2390, loss: 0.7370
2024-05-28 17:15:38,909 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 15:07:10, time: 0.931, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0488, loss_cls: 0.1988, acc: 92.7683, loss_bbox: 0.2456, loss_mask: 0.2442, loss: 0.7593
2024-05-28 17:16:24,470 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 15:06:22, time: 0.911, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0464, loss_cls: 0.1844, acc: 93.1084, loss_bbox: 0.2395, loss_mask: 0.2433, loss: 0.7334
2024-05-28 17:17:10,582 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 15:05:34, time: 0.922, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0486, loss_cls: 0.1991, acc: 92.7310, loss_bbox: 0.2527, loss_mask: 0.2464, loss: 0.7679
2024-05-28 17:17:56,925 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 15:04:47, time: 0.927, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0442, loss_cls: 0.1865, acc: 93.1990, loss_bbox: 0.2408, loss_mask: 0.2386, loss: 0.7297
2024-05-28 17:18:45,556 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 15:04:05, time: 0.973, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0479, loss_cls: 0.1995, acc: 92.7961, loss_bbox: 0.2500,
loss_mask: 0.2413, loss: 0.7595 2024-05-28 17:19:31,385 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 15:03:17, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0472, loss_cls: 0.1948, acc: 92.9021, loss_bbox: 0.2489, loss_mask: 0.2424, loss: 0.7532 2024-05-28 17:20:18,255 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 15:02:31, time: 0.937, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0460, loss_cls: 0.2007, acc: 92.6003, loss_bbox: 0.2522, loss_mask: 0.2453, loss: 0.7644 2024-05-28 17:21:04,740 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 15:01:44, time: 0.930, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0466, loss_cls: 0.1973, acc: 92.7585, loss_bbox: 0.2504, loss_mask: 0.2433, loss: 0.7572 2024-05-28 17:21:50,162 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 15:00:55, time: 0.908, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0454, loss_cls: 0.1873, acc: 93.2615, loss_bbox: 0.2346, loss_mask: 0.2407, loss: 0.7281 2024-05-28 17:22:36,095 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 15:00:07, time: 0.919, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0456, loss_cls: 0.1909, acc: 93.0122, loss_bbox: 0.2402, loss_mask: 0.2377, loss: 0.7335 2024-05-28 17:23:22,962 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 14:59:21, time: 0.937, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0468, loss_cls: 0.1880, acc: 93.1489, loss_bbox: 0.2406, loss_mask: 0.2402, loss: 0.7350 2024-05-28 17:24:13,939 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 14:58:43, time: 1.020, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0478, loss_cls: 0.1951, acc: 92.8000, loss_bbox: 0.2494, loss_mask: 0.2478, loss: 0.7619 2024-05-28 17:25:00,046 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 14:57:56, time: 0.922, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0465, loss_cls: 0.1877, acc: 93.1060, loss_bbox: 0.2333, loss_mask: 0.2356, loss: 0.7234 2024-05-28 17:25:45,904 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 14:57:08, time: 0.917, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0475, loss_cls: 0.1912, acc: 92.9341, loss_bbox: 0.2474, loss_mask: 0.2435, loss: 0.7501 2024-05-28 17:26:31,633 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 14:56:20, time: 0.915, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0437, loss_cls: 0.1967, acc: 92.9973, loss_bbox: 0.2480, loss_mask: 0.2453, loss: 0.7522 2024-05-28 17:27:17,612 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 14:55:32, time: 0.920, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0450, loss_cls: 0.1942, acc: 92.9534, loss_bbox: 0.2431, loss_mask: 0.2441, loss: 0.7478 2024-05-28 17:28:03,568 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 14:54:44, time: 0.919, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0491, loss_cls: 0.1948, acc: 92.8535, loss_bbox: 0.2503, loss_mask: 0.2447, loss: 0.7597 2024-05-28 17:28:49,013 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 14:53:55, time: 0.909, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0448, loss_cls: 0.1892, acc: 93.0808, loss_bbox: 0.2425, loss_mask: 0.2379, loss: 0.7329 2024-05-28 17:29:34,546 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 14:53:07, time: 0.911, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0466, loss_cls: 0.1945, acc: 92.9941, loss_bbox: 0.2397, loss_mask: 0.2444, loss: 0.7460 2024-05-28 17:30:21,010 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 14:52:20, time: 0.929, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0483, loss_cls: 0.1948, acc: 92.9326, loss_bbox: 0.2481, loss_mask: 0.2472, loss: 0.7591 2024-05-28 17:31:11,589 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 14:51:41, time: 1.012, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0440, loss_cls: 0.1920, acc: 93.0938, loss_bbox: 0.2421, loss_mask: 0.2421, loss: 0.7388 2024-05-28 17:31:57,125 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 14:50:53, time: 0.911, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0450, loss_cls: 0.1875, acc: 93.2358, loss_bbox: 0.2355, loss_mask: 0.2382, loss: 0.7262 2024-05-28 17:32:45,839 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 14:50:10, time: 0.974, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0472, loss_cls: 0.1919, acc: 92.9958, loss_bbox: 0.2417, loss_mask: 0.2400, loss: 0.7412 2024-05-28 17:33:31,529 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 14:49:22, time: 0.914, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0446, loss_cls: 0.1934, acc: 92.9917, loss_bbox: 0.2439, loss_mask: 0.2410, loss: 0.7436 2024-05-28 17:34:18,089 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 14:48:35, time: 0.931, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0448, loss_cls: 0.1903, acc: 93.1450, loss_bbox: 0.2387, loss_mask: 0.2367, loss: 0.7293 2024-05-28 17:35:09,153 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 14:47:57, time: 1.021, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0471, loss_cls: 0.2003, acc: 92.6880, loss_bbox: 0.2500, loss_mask: 0.2456, loss: 0.7622 2024-05-28 17:35:57,243 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 14:47:13, time: 0.962, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0432, loss_cls: 0.1903, acc: 93.1326, loss_bbox: 0.2353, loss_mask: 0.2395, loss: 0.7273 2024-05-28 17:36:42,974 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 14:46:25, time: 0.915, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0439, loss_cls: 0.1885, acc: 92.9583, loss_bbox: 0.2412, loss_mask: 0.2412, loss: 0.7351 2024-05-28 17:37:29,508 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 14:45:38, time: 0.931, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0480, loss_cls: 0.1916, acc: 92.9263, loss_bbox: 0.2482, loss_mask: 0.2401, loss: 0.7470 2024-05-28 17:38:15,763 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 14:44:51, time: 0.925, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0460, loss_cls: 0.1970, acc: 92.8445, loss_bbox: 0.2490, loss_mask: 0.2444, loss: 0.7567 2024-05-28 17:39:01,951 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 14:44:04, time: 0.924, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0482, loss_cls: 0.1885, acc: 93.1147, loss_bbox: 0.2415, loss_mask: 0.2481, loss: 0.7480 2024-05-28 17:39:47,630 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 14:43:16, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0196, 
loss_rpn_bbox: 0.0443, loss_cls: 0.1821, acc: 93.3174, loss_bbox: 0.2349, loss_mask: 0.2335, loss: 0.7144 2024-05-28 17:40:39,037 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 14:42:38, time: 1.028, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0471, loss_cls: 0.1953, acc: 92.9487, loss_bbox: 0.2451, loss_mask: 0.2481, loss: 0.7558 2024-05-28 17:41:24,701 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 14:41:50, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0438, loss_cls: 0.1812, acc: 93.3489, loss_bbox: 0.2315, loss_mask: 0.2384, loss: 0.7124 2024-05-28 17:42:09,805 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 14:41:00, time: 0.902, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0458, loss_cls: 0.1907, acc: 93.0549, loss_bbox: 0.2419, loss_mask: 0.2408, loss: 0.7387 2024-05-28 17:42:56,403 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 14:40:14, time: 0.932, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0470, loss_cls: 0.2090, acc: 92.4507, loss_bbox: 0.2566, loss_mask: 0.2429, loss: 0.7759 2024-05-28 17:43:42,322 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 14:39:26, time: 0.918, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0454, loss_cls: 0.1886, acc: 93.0200, loss_bbox: 0.2393, loss_mask: 0.2388, loss: 0.7306 2024-05-28 17:44:28,119 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 14:38:38, time: 0.916, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0489, loss_cls: 0.1968, acc: 92.7310, loss_bbox: 0.2512, loss_mask: 0.2458, loss: 0.7631 2024-05-28 17:45:13,516 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 14:37:49, time: 0.908, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0463, loss_cls: 0.1994, acc: 92.7358, loss_bbox: 0.2482, loss_mask: 0.2396, loss: 0.7528 2024-05-28 17:45:59,922 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 14:37:02, time: 0.928, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0464, loss_cls: 0.1984, acc: 92.7554, loss_bbox: 0.2472, loss_mask: 0.2452, loss: 0.7566 2024-05-28 17:46:45,312 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 14:36:14, time: 0.908, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0448, loss_cls: 0.1872, acc: 93.2344, loss_bbox: 0.2418, loss_mask: 0.2408, loss: 0.7353 2024-05-28 17:47:36,550 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 14:35:36, time: 1.025, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0455, loss_cls: 0.1904, acc: 93.1484, loss_bbox: 0.2391, loss_mask: 0.2399, loss: 0.7349 2024-05-28 17:48:22,654 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 14:34:48, time: 0.922, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0476, loss_cls: 0.1961, acc: 92.7776, loss_bbox: 0.2453, loss_mask: 0.2447, loss: 0.7550 2024-05-28 17:49:12,413 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 14:34:07, time: 0.995, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0489, loss_cls: 0.1966, acc: 92.8132, loss_bbox: 0.2459, loss_mask: 0.2431, loss: 0.7568 2024-05-28 17:49:59,002 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 14:33:21, time: 0.932, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0479, loss_cls: 0.2063, acc: 92.5193, loss_bbox: 0.2498, loss_mask: 0.2496, 
loss: 0.7751 2024-05-28 17:50:44,831 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 14:32:33, time: 0.917, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0486, loss_cls: 0.2050, acc: 92.5239, loss_bbox: 0.2552, loss_mask: 0.2429, loss: 0.7724 2024-05-28 17:51:30,350 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 14:31:44, time: 0.910, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0467, loss_cls: 0.1910, acc: 93.1875, loss_bbox: 0.2382, loss_mask: 0.2407, loss: 0.7354 2024-05-28 17:52:22,528 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 14:31:07, time: 1.044, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0441, loss_cls: 0.1852, acc: 93.2273, loss_bbox: 0.2329, loss_mask: 0.2414, loss: 0.7227 2024-05-28 17:53:08,894 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 14:30:20, time: 0.927, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0471, loss_cls: 0.1962, acc: 92.7646, loss_bbox: 0.2471, loss_mask: 0.2454, loss: 0.7546 2024-05-28 17:53:54,951 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 14:29:33, time: 0.921, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0454, loss_cls: 0.1952, acc: 92.8479, loss_bbox: 0.2488, loss_mask: 0.2424, loss: 0.7504 2024-05-28 17:54:40,637 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 14:28:45, time: 0.914, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0480, loss_cls: 0.2038, acc: 92.5703, loss_bbox: 0.2526, loss_mask: 0.2425, loss: 0.7686 2024-05-28 17:55:26,764 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 14:27:57, time: 0.923, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0446, loss_cls: 0.1908, acc: 93.0369, loss_bbox: 0.2363, loss_mask: 0.2341, loss: 0.7250 2024-05-28 17:56:12,021 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 14:27:08, time: 0.905, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0438, loss_cls: 0.1893, acc: 93.2073, loss_bbox: 0.2355, loss_mask: 0.2374, loss: 0.7242 2024-05-28 17:57:03,229 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 14:26:30, time: 1.024, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0473, loss_cls: 0.1973, acc: 92.8665, loss_bbox: 0.2458, loss_mask: 0.2440, loss: 0.7542 2024-05-28 17:57:49,326 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 14:25:42, time: 0.922, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0478, loss_cls: 0.1916, acc: 92.9253, loss_bbox: 0.2470, loss_mask: 0.2377, loss: 0.7443 2024-05-28 17:58:34,957 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 14:24:54, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0439, loss_cls: 0.1852, acc: 93.2634, loss_bbox: 0.2346, loss_mask: 0.2310, loss: 0.7148 2024-05-28 17:59:20,950 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 14:24:07, time: 0.920, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0452, loss_cls: 0.1875, acc: 93.1340, loss_bbox: 0.2330, loss_mask: 0.2371, loss: 0.7220 2024-05-28 18:00:07,557 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 14:23:20, time: 0.932, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0479, loss_cls: 0.2028, acc: 92.6587, loss_bbox: 0.2477, loss_mask: 0.2405, loss: 0.7596 2024-05-28 18:00:52,855 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
14:22:31, time: 0.906, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0451, loss_cls: 0.1863, acc: 93.1948, loss_bbox: 0.2412, loss_mask: 0.2397, loss: 0.7311 2024-05-28 18:01:38,261 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 14:21:43, time: 0.908, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0456, loss_cls: 0.1912, acc: 92.9199, loss_bbox: 0.2468, loss_mask: 0.2401, loss: 0.7434 2024-05-28 18:02:23,508 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 14:20:54, time: 0.905, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0424, loss_cls: 0.1880, acc: 93.1641, loss_bbox: 0.2337, loss_mask: 0.2403, loss: 0.7225 2024-05-28 18:03:09,596 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 14:20:06, time: 0.922, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0466, loss_cls: 0.1913, acc: 93.0271, loss_bbox: 0.2404, loss_mask: 0.2372, loss: 0.7335 2024-05-28 18:03:58,420 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 14:19:23, time: 0.976, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0480, loss_cls: 0.2014, acc: 92.5837, loss_bbox: 0.2529, loss_mask: 0.2450, loss: 0.7686 2024-05-28 18:04:47,183 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 14:18:41, time: 0.975, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0476, loss_cls: 0.1976, acc: 92.7622, loss_bbox: 0.2503, loss_mask: 0.2437, loss: 0.7598 2024-05-28 18:05:35,153 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 14:17:56, time: 0.959, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0446, loss_cls: 0.1863, acc: 93.1990, loss_bbox: 0.2390, loss_mask: 0.2418, loss: 0.7306 2024-05-28 18:06:21,803 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 14:17:10, time: 0.933, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0477, loss_cls: 0.1993, acc: 92.7473, loss_bbox: 0.2503, loss_mask: 0.2485, loss: 0.7676 2024-05-28 18:07:08,237 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 14:16:23, time: 0.929, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0478, loss_cls: 0.1947, acc: 92.8621, loss_bbox: 0.2428, loss_mask: 0.2466, loss: 0.7513 2024-05-28 18:07:53,740 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 14:15:34, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0474, loss_cls: 0.1987, acc: 92.7542, loss_bbox: 0.2472, loss_mask: 0.2451, loss: 0.7583 2024-05-28 18:08:45,886 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 14:14:57, time: 1.043, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0433, loss_cls: 0.1934, acc: 93.0532, loss_bbox: 0.2382, loss_mask: 0.2344, loss: 0.7281 2024-05-28 18:09:32,143 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 14:14:10, time: 0.925, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0482, loss_cls: 0.1963, acc: 92.9641, loss_bbox: 0.2408, loss_mask: 0.2353, loss: 0.7406 2024-05-28 18:10:18,468 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 14:13:23, time: 0.927, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0481, loss_cls: 0.1918, acc: 93.1338, loss_bbox: 0.2385, loss_mask: 0.2375, loss: 0.7358 2024-05-28 18:11:04,265 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 14:12:35, time: 0.916, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 
0.0437, loss_cls: 0.1744, acc: 93.5991, loss_bbox: 0.2224, loss_mask: 0.2269, loss: 0.6859 2024-05-28 18:11:50,246 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 14:11:47, time: 0.920, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0463, loss_cls: 0.1975, acc: 92.8350, loss_bbox: 0.2466, loss_mask: 0.2444, loss: 0.7533 2024-05-28 18:12:35,768 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 14:10:59, time: 0.910, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0481, loss_cls: 0.1989, acc: 92.6658, loss_bbox: 0.2541, loss_mask: 0.2427, loss: 0.7627 2024-05-28 18:13:26,569 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 14:10:19, time: 1.016, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0454, loss_cls: 0.1875, acc: 93.1807, loss_bbox: 0.2305, loss_mask: 0.2345, loss: 0.7168 2024-05-28 18:14:13,027 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 14:09:33, time: 0.929, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0457, loss_cls: 0.1963, acc: 92.7676, loss_bbox: 0.2500, loss_mask: 0.2378, loss: 0.7492 2024-05-28 18:14:58,642 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 14:08:44, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0465, loss_cls: 0.1926, acc: 92.9434, loss_bbox: 0.2414, loss_mask: 0.2414, loss: 0.7421 2024-05-28 18:15:44,542 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 14:07:57, time: 0.918, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0462, loss_cls: 0.1968, acc: 92.8137, loss_bbox: 0.2519, loss_mask: 0.2447, loss: 0.7595 2024-05-28 18:16:30,372 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 14:07:09, time: 0.917, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0469, loss_cls: 0.1942, acc: 92.9067, loss_bbox: 0.2450, loss_mask: 0.2405, loss: 0.7464 2024-05-28 18:17:16,630 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 14:06:22, time: 0.925, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0449, loss_cls: 0.1924, acc: 92.9558, loss_bbox: 0.2428, loss_mask: 0.2426, loss: 0.7424 2024-05-28 18:18:02,451 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 14:05:34, time: 0.916, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0445, loss_cls: 0.1930, acc: 93.0171, loss_bbox: 0.2435, loss_mask: 0.2476, loss: 0.7475 2024-05-28 18:18:48,104 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 14:04:46, time: 0.913, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0441, loss_cls: 0.1881, acc: 93.1238, loss_bbox: 0.2363, loss_mask: 0.2369, loss: 0.7243 2024-05-28 18:19:33,498 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 14:03:57, time: 0.908, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0427, loss_cls: 0.1786, acc: 93.4507, loss_bbox: 0.2230, loss_mask: 0.2367, loss: 0.7016 2024-05-28 18:20:22,430 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 14:03:14, time: 0.979, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0466, loss_cls: 0.1937, acc: 93.0310, loss_bbox: 0.2412, loss_mask: 0.2451, loss: 0.7475 2024-05-28 18:21:11,296 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 14:02:31, time: 0.977, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0444, loss_cls: 0.1834, acc: 93.3604, loss_bbox: 0.2321, loss_mask: 0.2374, loss: 0.7163 
2024-05-28 18:21:59,739 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 14:01:48, time: 0.969, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0440, loss_cls: 0.1873, acc: 93.2314, loss_bbox: 0.2361, loss_mask: 0.2354, loss: 0.7214 2024-05-28 18:22:45,444 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 14:01:00, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0451, loss_cls: 0.1926, acc: 93.0435, loss_bbox: 0.2388, loss_mask: 0.2383, loss: 0.7341 2024-05-28 18:23:30,962 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 14:00:11, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0450, loss_cls: 0.1962, acc: 92.8079, loss_bbox: 0.2440, loss_mask: 0.2419, loss: 0.7478 2024-05-28 18:24:16,498 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 13:59:23, time: 0.911, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0445, loss_cls: 0.1916, acc: 93.0466, loss_bbox: 0.2392, loss_mask: 0.2390, loss: 0.7338 2024-05-28 18:25:06,545 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 13:58:42, time: 1.001, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0463, loss_cls: 0.1914, acc: 93.0193, loss_bbox: 0.2460, loss_mask: 0.2409, loss: 0.7450 2024-05-28 18:25:55,670 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 13:57:59, time: 0.982, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0478, loss_cls: 0.2033, acc: 92.4941, loss_bbox: 0.2587, loss_mask: 0.2468, loss: 0.7779 2024-05-28 18:26:41,147 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 13:57:11, time: 0.910, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0445, loss_cls: 0.1842, acc: 93.2803, loss_bbox: 0.2346, loss_mask: 0.2478, loss: 0.7313 2024-05-28 18:27:26,884 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 13:56:23, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0465, loss_cls: 0.1896, acc: 93.0461, loss_bbox: 0.2435, loss_mask: 0.2394, loss: 0.7380 2024-05-28 18:28:12,440 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 13:55:35, time: 0.911, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0481, loss_cls: 0.1964, acc: 92.9631, loss_bbox: 0.2398, loss_mask: 0.2356, loss: 0.7411 2024-05-28 18:28:58,064 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 13:54:46, time: 0.913, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0446, loss_cls: 0.1840, acc: 93.2817, loss_bbox: 0.2328, loss_mask: 0.2330, loss: 0.7142 2024-05-28 18:29:46,723 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 13:54:03, time: 0.973, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0456, loss_cls: 0.1984, acc: 92.7402, loss_bbox: 0.2483, loss_mask: 0.2486, loss: 0.7596 2024-05-28 18:30:34,563 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 13:53:18, time: 0.957, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0469, loss_cls: 0.1955, acc: 92.9368, loss_bbox: 0.2397, loss_mask: 0.2365, loss: 0.7372 2024-05-28 18:31:20,459 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 13:52:31, time: 0.918, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0454, loss_cls: 0.1943, acc: 92.9331, loss_bbox: 0.2393, loss_mask: 0.2385, loss: 0.7366 2024-05-28 18:32:06,512 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 13:51:43, 
time: 0.921, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0495, loss_cls: 0.1983, acc: 92.7632, loss_bbox: 0.2526, loss_mask: 0.2491, loss: 0.7723 2024-05-28 18:32:52,467 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 13:50:56, time: 0.919, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0483, loss_cls: 0.1971, acc: 92.7021, loss_bbox: 0.2523, loss_mask: 0.2450, loss: 0.7629 2024-05-28 18:33:38,523 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 13:50:08, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0467, loss_cls: 0.1862, acc: 93.1255, loss_bbox: 0.2375, loss_mask: 0.2381, loss: 0.7273 2024-05-28 18:34:24,217 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 13:49:20, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0449, loss_cls: 0.1940, acc: 92.9543, loss_bbox: 0.2368, loss_mask: 0.2451, loss: 0.7422 2024-05-28 18:35:10,500 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 13:48:33, time: 0.926, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0465, loss_cls: 0.1879, acc: 93.1299, loss_bbox: 0.2372, loss_mask: 0.2407, loss: 0.7335 2024-05-28 18:35:57,044 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 13:47:46, time: 0.931, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0491, loss_cls: 0.1944, acc: 92.9038, loss_bbox: 0.2484, loss_mask: 0.2445, loss: 0.7578 2024-05-28 18:36:43,185 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 13:46:59, time: 0.923, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0453, loss_cls: 0.1944, acc: 92.9304, loss_bbox: 0.2466, loss_mask: 0.2462, loss: 0.7520 2024-05-28 18:37:35,393 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 13:46:21, time: 1.044, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0435, loss_cls: 0.1834, acc: 93.3469, loss_bbox: 0.2293, loss_mask: 0.2387, loss: 0.7131 2024-05-28 18:38:21,576 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 13:45:34, time: 0.924, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0483, loss_cls: 0.1891, acc: 93.0510, loss_bbox: 0.2365, loss_mask: 0.2399, loss: 0.7344 2024-05-28 18:39:09,719 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 13:44:50, time: 0.963, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0461, loss_cls: 0.1942, acc: 92.9504, loss_bbox: 0.2404, loss_mask: 0.2396, loss: 0.7403 2024-05-28 18:39:55,666 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 13:44:02, time: 0.919, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0480, loss_cls: 0.1953, acc: 92.8206, loss_bbox: 0.2500, loss_mask: 0.2445, loss: 0.7564 2024-05-28 18:40:41,834 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 13:43:15, time: 0.923, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0458, loss_cls: 0.1914, acc: 93.0232, loss_bbox: 0.2365, loss_mask: 0.2384, loss: 0.7326 2024-05-28 18:41:33,317 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 13:42:35, time: 1.030, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0473, loss_cls: 0.2029, acc: 92.6880, loss_bbox: 0.2451, loss_mask: 0.2451, loss: 0.7610 2024-05-28 18:42:22,648 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 13:41:53, time: 0.987, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0482, 
loss_cls: 0.1882, acc: 93.1460, loss_bbox: 0.2350, loss_mask: 0.2416, loss: 0.7340 2024-05-28 18:43:09,281 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 13:41:06, time: 0.933, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0450, loss_cls: 0.1917, acc: 92.9885, loss_bbox: 0.2441, loss_mask: 0.2384, loss: 0.7388 2024-05-28 18:43:55,878 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 13:40:20, time: 0.932, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0464, loss_cls: 0.1964, acc: 92.9431, loss_bbox: 0.2442, loss_mask: 0.2418, loss: 0.7484 2024-05-28 18:44:41,369 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 13:39:31, time: 0.910, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0458, loss_cls: 0.1963, acc: 92.9424, loss_bbox: 0.2408, loss_mask: 0.2372, loss: 0.7407 2024-05-28 18:45:27,804 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 13:38:44, time: 0.929, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0480, loss_cls: 0.1976, acc: 92.8059, loss_bbox: 0.2457, loss_mask: 0.2439, loss: 0.7550 2024-05-28 18:46:15,521 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 13:37:59, time: 0.954, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0457, loss_cls: 0.1913, acc: 93.0632, loss_bbox: 0.2460, loss_mask: 0.2413, loss: 0.7441 2024-05-28 18:47:03,227 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 13:37:14, time: 0.954, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0479, loss_cls: 0.1967, acc: 92.9214, loss_bbox: 0.2443, loss_mask: 0.2402, loss: 0.7497 2024-05-28 18:47:48,997 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 13:36:27, time: 0.915, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0449, loss_cls: 0.1813, acc: 93.3735, loss_bbox: 0.2310, loss_mask: 0.2386, loss: 0.7130 2024-05-28 18:48:34,492 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 13:35:38, time: 0.910, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0443, loss_cls: 0.1950, acc: 92.9563, loss_bbox: 0.2474, loss_mask: 0.2407, loss: 0.7462 2024-05-28 18:49:20,021 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 13:34:50, time: 0.911, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0442, loss_cls: 0.1876, acc: 93.2366, loss_bbox: 0.2371, loss_mask: 0.2401, loss: 0.7297 2024-05-28 18:50:06,173 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 13:34:03, time: 0.923, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0470, loss_cls: 0.1951, acc: 92.9077, loss_bbox: 0.2418, loss_mask: 0.2460, loss: 0.7505 2024-05-28 18:50:52,011 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 13:33:15, time: 0.917, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0466, loss_cls: 0.1956, acc: 92.8391, loss_bbox: 0.2468, loss_mask: 0.2466, loss: 0.7554 2024-05-28 18:51:37,715 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 13:32:27, time: 0.914, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0480, loss_cls: 0.1869, acc: 93.2698, loss_bbox: 0.2391, loss_mask: 0.2361, loss: 0.7309 2024-05-28 18:52:23,150 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 13:31:39, time: 0.909, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0430, loss_cls: 0.1824, acc: 93.3611, loss_bbox: 0.2329, loss_mask: 0.2386, loss: 0.7151 2024-05-28 
18:53:09,060 - mmdet - INFO - Epoch [5][6450/7330] lr: 1.000e-04, eta: 13:30:51, time: 0.918, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0472, loss_cls: 0.1946, acc: 92.8860, loss_bbox: 0.2470, loss_mask: 0.2415, loss: 0.7484 2024-05-28 18:53:59,686 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 13:30:10, time: 1.012, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0475, loss_cls: 0.1971, acc: 92.8516, loss_bbox: 0.2423, loss_mask: 0.2348, loss: 0.7411 2024-05-28 18:54:45,255 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 13:29:22, time: 0.911, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0464, loss_cls: 0.1895, acc: 93.1428, loss_bbox: 0.2354, loss_mask: 0.2374, loss: 0.7268 2024-05-28 18:55:32,793 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 13:28:37, time: 0.951, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0431, loss_cls: 0.1910, acc: 93.0093, loss_bbox: 0.2352, loss_mask: 0.2390, loss: 0.7274 2024-05-28 18:56:18,423 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 13:27:49, time: 0.913, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0441, loss_cls: 0.1949, acc: 92.9380, loss_bbox: 0.2389, loss_mask: 0.2397, loss: 0.7370 2024-05-28 18:57:04,247 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 13:27:01, time: 0.917, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0428, loss_cls: 0.1915, acc: 93.0310, loss_bbox: 0.2390, loss_mask: 0.2405, loss: 0.7345 2024-05-28 18:57:54,106 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 13:26:16, time: 0.950, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0440, loss_cls: 0.1865, acc: 93.2595, loss_bbox: 0.2337, loss_mask: 0.2372, loss: 0.7210 2024-05-28 18:58:43,441 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 13:25:36, time: 1.034, data_time: 0.112, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0447, loss_cls: 0.1934, acc: 92.9478, loss_bbox: 0.2441, loss_mask: 0.2451, loss: 0.7460 2024-05-28 18:59:28,830 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 13:24:48, time: 0.908, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0463, loss_cls: 0.1968, acc: 92.9705, loss_bbox: 0.2401, loss_mask: 0.2415, loss: 0.7455 2024-05-28 19:00:14,880 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 13:24:01, time: 0.921, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0454, loss_cls: 0.1998, acc: 92.6829, loss_bbox: 0.2491, loss_mask: 0.2408, loss: 0.7540 2024-05-28 19:01:00,507 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 13:23:13, time: 0.913, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0480, loss_cls: 0.1952, acc: 92.8105, loss_bbox: 0.2453, loss_mask: 0.2396, loss: 0.7492 2024-05-28 19:01:46,340 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 13:22:25, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0463, loss_cls: 0.1924, acc: 92.9602, loss_bbox: 0.2448, loss_mask: 0.2451, loss: 0.7489 2024-05-28 19:02:32,422 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 13:21:37, time: 0.922, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0457, loss_cls: 0.1922, acc: 93.0605, loss_bbox: 0.2409, loss_mask: 0.2404, loss: 0.7388 2024-05-28 19:03:22,305 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 13:20:55, time: 0.998, 
data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0462, loss_cls: 0.1916, acc: 92.9773, loss_bbox: 0.2401, loss_mask: 0.2436, loss: 0.7419
2024-05-28 19:04:07,730 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 13:20:07, time: 0.908, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0462, loss_cls: 0.1921, acc: 93.0610, loss_bbox: 0.2400, loss_mask: 0.2413, loss: 0.7406
2024-05-28 19:04:53,668 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 13:19:20, time: 0.919, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0466, loss_cls: 0.1892, acc: 93.0740, loss_bbox: 0.2395, loss_mask: 0.2381, loss: 0.7343
2024-05-28 19:05:39,175 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 13:18:31, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0452, loss_cls: 0.1929, acc: 92.9714, loss_bbox: 0.2423, loss_mask: 0.2371, loss: 0.7373
2024-05-28 19:06:24,638 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 13:17:43, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0459, loss_cls: 0.1984, acc: 92.7952, loss_bbox: 0.2441, loss_mask: 0.2333, loss: 0.7409
2024-05-28 19:06:53,027 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-05-28 19:09:03,014 - mmdet - INFO - Evaluating bbox...
2024-05-28 19:09:28,685 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.454
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.699
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.507
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.283
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.504
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.612
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.577
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.577
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.577
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.391
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.623
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.732
2024-05-28 19:09:28,686 - mmdet - INFO - Evaluating segm...
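Note on reading the per-iteration entries: the reported loss is, to logging precision, the sum of the five loss_* terms (loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox, loss_mask); acc is a box-head classification accuracy and is not a loss term. A minimal Python sketch checking this against the Epoch [5][7150/7330] entry above:

```python
# Minimal sketch: verify that the logged total `loss` matches the sum of the
# individual loss_* terms for one entry. The values below are copied from the
# Epoch [5][7150/7330] line above; `acc` is an accuracy metric and is excluded.
import re

entry = ("loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0462, loss_cls: 0.1921, "
         "acc: 93.0610, loss_bbox: 0.2400, loss_mask: 0.2413, loss: 0.7406")

parts = dict(re.findall(r"(\w+): ([\d.]+)", entry))
total = sum(float(v) for k, v in parts.items() if k.startswith("loss_"))
print(f"sum of loss_* terms = {total:.4f}, logged loss = {parts['loss']}")
# -> sum of loss_* terms = 0.7406, logged loss = 0.7406
```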
2024-05-28 19:09:56,642 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.411
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.661
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.437
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.202
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.453
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.323
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.572
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.708
2024-05-28 19:09:57,027 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 19:09:57,028 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4540, bbox_mAP_50: 0.6990, bbox_mAP_75: 0.5070, bbox_mAP_s: 0.2830, bbox_mAP_m: 0.5040, bbox_mAP_l: 0.6120, bbox_mAP_copypaste: 0.454 0.699 0.507 0.283 0.504 0.612, segm_mAP: 0.4110, segm_mAP_50: 0.6610, segm_mAP_75: 0.4370, segm_mAP_s: 0.2020, segm_mAP_m: 0.4530, segm_mAP_l: 0.6250, segm_mAP_copypaste: 0.411 0.661 0.437 0.202 0.453 0.625
2024-05-28 19:10:51,856 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 13:16:01, time: 1.095, data_time: 0.121, memory: 20925, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0469, loss_cls: 0.1894, acc: 93.0654, loss_bbox: 0.2380, loss_mask: 0.2339, loss: 0.7283
2024-05-28 19:11:38,623 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 13:15:15, time: 0.935, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0442, loss_cls: 0.1853, acc: 93.1287, loss_bbox: 0.2414, loss_mask: 0.2384, loss: 0.7272
2024-05-28 19:12:24,807 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 13:14:27, time: 0.924, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0466, loss_cls: 0.1927, acc: 92.9443, loss_bbox: 0.2461, loss_mask: 0.2374, loss: 0.7408
2024-05-28 19:13:12,803 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 13:13:43, time: 0.960, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0424, loss_cls: 0.1700, acc: 93.6763, loss_bbox: 0.2226, loss_mask: 0.2337, loss: 0.6861
2024-05-28 19:13:59,174 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 13:12:56, time: 0.927, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0421, loss_cls: 0.1748, acc: 93.5081, loss_bbox: 0.2294, loss_mask: 0.2345, loss: 0.6967
2024-05-28 19:14:45,301 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 13:12:09, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0455, loss_cls: 0.1873, acc: 93.0750, loss_bbox: 0.2377, loss_mask: 0.2322, loss: 0.7199
2024-05-28 19:15:31,156 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 19:15:31,157 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 13:11:21, time: 0.917, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0452, loss_cls: 0.1837, acc: 93.1567, loss_bbox: 0.2364, loss_mask: 0.2352, loss: 0.7185
2024-05-28 19:16:16,769 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 13:10:33, time: 0.912, data_time: 0.041, memory:
20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0422, loss_cls: 0.1808, acc: 93.3928, loss_bbox: 0.2332, loss_mask: 0.2358, loss: 0.7090 2024-05-28 19:17:02,721 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 13:09:46, time: 0.920, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0424, loss_cls: 0.1771, acc: 93.4973, loss_bbox: 0.2319, loss_mask: 0.2289, loss: 0.6970 2024-05-28 19:17:48,274 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 13:08:58, time: 0.911, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0423, loss_cls: 0.1853, acc: 93.1646, loss_bbox: 0.2368, loss_mask: 0.2349, loss: 0.7163 2024-05-28 19:18:34,697 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 13:08:11, time: 0.928, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0456, loss_cls: 0.1814, acc: 93.3491, loss_bbox: 0.2328, loss_mask: 0.2329, loss: 0.7120 2024-05-28 19:19:24,590 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 13:07:29, time: 0.998, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0468, loss_cls: 0.1859, acc: 93.1448, loss_bbox: 0.2397, loss_mask: 0.2416, loss: 0.7323 2024-05-28 19:20:12,949 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 13:06:45, time: 0.967, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0399, loss_cls: 0.1749, acc: 93.5857, loss_bbox: 0.2206, loss_mask: 0.2253, loss: 0.6783 2024-05-28 19:21:02,023 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 13:06:01, time: 0.981, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0458, loss_cls: 0.1847, acc: 93.1555, loss_bbox: 0.2335, loss_mask: 0.2387, loss: 0.7253 2024-05-28 19:21:48,412 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 13:05:15, time: 0.928, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0442, loss_cls: 0.1966, acc: 92.7534, loss_bbox: 0.2449, loss_mask: 0.2407, loss: 0.7446 2024-05-28 19:22:34,672 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 13:04:27, time: 0.925, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0438, loss_cls: 0.1821, acc: 93.2373, loss_bbox: 0.2318, loss_mask: 0.2340, loss: 0.7087 2024-05-28 19:23:21,032 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 13:03:41, time: 0.927, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0445, loss_cls: 0.1870, acc: 93.2043, loss_bbox: 0.2370, loss_mask: 0.2336, loss: 0.7205 2024-05-28 19:24:06,913 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 13:02:53, time: 0.918, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0463, loss_cls: 0.1871, acc: 93.1196, loss_bbox: 0.2399, loss_mask: 0.2339, loss: 0.7263 2024-05-28 19:24:52,773 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 13:02:05, time: 0.917, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0464, loss_cls: 0.1904, acc: 93.0017, loss_bbox: 0.2401, loss_mask: 0.2390, loss: 0.7351 2024-05-28 19:25:38,811 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 13:01:18, time: 0.921, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0443, loss_cls: 0.1800, acc: 93.3596, loss_bbox: 0.2285, loss_mask: 0.2309, loss: 0.7018 2024-05-28 19:26:28,199 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 13:00:35, time: 0.988, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0450, loss_cls: 0.1910, acc: 92.9927, loss_bbox: 0.2398, 
loss_mask: 0.2327, loss: 0.7258 2024-05-28 19:27:16,897 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 12:59:51, time: 0.974, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0444, loss_cls: 0.1875, acc: 93.1357, loss_bbox: 0.2351, loss_mask: 0.2329, loss: 0.7189 2024-05-28 19:28:02,969 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 12:59:04, time: 0.921, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0450, loss_cls: 0.1767, acc: 93.5347, loss_bbox: 0.2297, loss_mask: 0.2359, loss: 0.7070 2024-05-28 19:28:49,969 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 12:58:18, time: 0.940, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0453, loss_cls: 0.1887, acc: 93.0684, loss_bbox: 0.2383, loss_mask: 0.2335, loss: 0.7233 2024-05-28 19:29:38,139 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 12:57:34, time: 0.963, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0432, loss_cls: 0.1818, acc: 93.3765, loss_bbox: 0.2268, loss_mask: 0.2345, loss: 0.7037 2024-05-28 19:30:23,416 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 12:56:45, time: 0.906, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0405, loss_cls: 0.1781, acc: 93.4949, loss_bbox: 0.2282, loss_mask: 0.2339, loss: 0.6961 2024-05-28 19:31:09,356 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 19:31:09,356 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 12:55:58, time: 0.919, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0444, loss_cls: 0.1849, acc: 93.1572, loss_bbox: 0.2371, loss_mask: 0.2425, loss: 0.7264 2024-05-28 19:31:56,165 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 12:55:11, time: 0.936, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0474, loss_cls: 0.1919, acc: 92.9895, loss_bbox: 0.2440, loss_mask: 0.2364, loss: 0.7398 2024-05-28 19:32:46,466 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 12:54:30, time: 1.006, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0422, loss_cls: 0.1776, acc: 93.4460, loss_bbox: 0.2253, loss_mask: 0.2296, loss: 0.6921 2024-05-28 19:33:32,791 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 12:53:43, time: 0.926, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0466, loss_cls: 0.1938, acc: 92.8438, loss_bbox: 0.2455, loss_mask: 0.2361, loss: 0.7417 2024-05-28 19:34:19,113 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 12:52:56, time: 0.926, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0489, loss_cls: 0.1897, acc: 92.9844, loss_bbox: 0.2436, loss_mask: 0.2366, loss: 0.7386 2024-05-28 19:35:05,372 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 12:52:09, time: 0.925, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0440, loss_cls: 0.1825, acc: 93.2212, loss_bbox: 0.2364, loss_mask: 0.2356, loss: 0.7171 2024-05-28 19:35:55,756 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 12:51:27, time: 1.008, data_time: 0.068, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0422, loss_cls: 0.1767, acc: 93.5298, loss_bbox: 0.2248, loss_mask: 0.2264, loss: 0.6861 2024-05-28 19:36:44,173 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 12:50:43, time: 0.968, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0437, loss_cls: 0.1758, acc: 93.5820, loss_bbox: 
0.2287, loss_mask: 0.2312, loss: 0.6970 2024-05-28 19:37:33,197 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 12:49:59, time: 0.980, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0446, loss_cls: 0.1834, acc: 93.3213, loss_bbox: 0.2341, loss_mask: 0.2340, loss: 0.7157 2024-05-28 19:38:20,116 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 12:49:13, time: 0.938, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0455, loss_cls: 0.1865, acc: 93.0813, loss_bbox: 0.2418, loss_mask: 0.2387, loss: 0.7316 2024-05-28 19:39:06,274 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 12:48:26, time: 0.923, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0437, loss_cls: 0.1816, acc: 93.1782, loss_bbox: 0.2377, loss_mask: 0.2358, loss: 0.7156 2024-05-28 19:39:53,155 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 12:47:40, time: 0.938, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0452, loss_cls: 0.1893, acc: 92.9087, loss_bbox: 0.2424, loss_mask: 0.2341, loss: 0.7305 2024-05-28 19:40:39,112 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 12:46:52, time: 0.919, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0430, loss_cls: 0.1829, acc: 93.2300, loss_bbox: 0.2359, loss_mask: 0.2357, loss: 0.7137 2024-05-28 19:41:25,492 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 12:46:05, time: 0.928, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0456, loss_cls: 0.1926, acc: 92.9333, loss_bbox: 0.2449, loss_mask: 0.2417, loss: 0.7427 2024-05-28 19:42:11,561 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 12:45:18, time: 0.921, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0443, loss_cls: 0.1831, acc: 93.2368, loss_bbox: 0.2373, loss_mask: 0.2350, loss: 0.7180 2024-05-28 19:43:01,050 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 12:44:35, time: 0.990, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0427, loss_cls: 0.1878, acc: 93.1921, loss_bbox: 0.2323, loss_mask: 0.2334, loss: 0.7145 2024-05-28 19:43:49,876 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 12:43:51, time: 0.976, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0449, loss_cls: 0.1871, acc: 93.0325, loss_bbox: 0.2370, loss_mask: 0.2405, loss: 0.7287 2024-05-28 19:44:36,999 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 12:43:05, time: 0.943, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0472, loss_cls: 0.1928, acc: 92.8809, loss_bbox: 0.2466, loss_mask: 0.2376, loss: 0.7442 2024-05-28 19:45:23,717 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 12:42:19, time: 0.934, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0480, loss_cls: 0.1940, acc: 92.8594, loss_bbox: 0.2443, loss_mask: 0.2356, loss: 0.7413 2024-05-28 19:46:13,108 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 12:41:36, time: 0.988, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0439, loss_cls: 0.1831, acc: 93.2510, loss_bbox: 0.2298, loss_mask: 0.2360, loss: 0.7115 2024-05-28 19:46:59,512 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 19:46:59,512 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 12:40:49, time: 0.928, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0414, loss_cls: 0.1748, acc: 93.5808, 
loss_bbox: 0.2232, loss_mask: 0.2270, loss: 0.6819 2024-05-28 19:47:45,741 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 12:40:02, time: 0.925, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0422, loss_cls: 0.1814, acc: 93.3767, loss_bbox: 0.2292, loss_mask: 0.2292, loss: 0.6997 2024-05-28 19:48:32,299 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 12:39:15, time: 0.931, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0464, loss_cls: 0.1925, acc: 92.9070, loss_bbox: 0.2431, loss_mask: 0.2327, loss: 0.7337 2024-05-28 19:49:22,675 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 12:38:33, time: 1.008, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0448, loss_cls: 0.1879, acc: 93.1035, loss_bbox: 0.2405, loss_mask: 0.2319, loss: 0.7233 2024-05-28 19:50:08,669 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 12:37:46, time: 0.920, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0437, loss_cls: 0.1830, acc: 93.2461, loss_bbox: 0.2313, loss_mask: 0.2303, loss: 0.7056 2024-05-28 19:50:55,066 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 12:36:59, time: 0.928, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0446, loss_cls: 0.1897, acc: 93.0708, loss_bbox: 0.2406, loss_mask: 0.2397, loss: 0.7316 2024-05-28 19:51:40,890 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 12:36:11, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1806, acc: 93.2717, loss_bbox: 0.2354, loss_mask: 0.2287, loss: 0.7049 2024-05-28 19:52:29,899 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 12:35:27, time: 0.980, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0459, loss_cls: 0.1843, acc: 93.1736, loss_bbox: 0.2361, loss_mask: 0.2322, loss: 0.7169 2024-05-28 19:53:17,755 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 12:34:42, time: 0.957, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0439, loss_cls: 0.1898, acc: 92.9978, loss_bbox: 0.2415, loss_mask: 0.2408, loss: 0.7345 2024-05-28 19:54:06,180 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 12:33:58, time: 0.968, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0439, loss_cls: 0.1784, acc: 93.4050, loss_bbox: 0.2282, loss_mask: 0.2260, loss: 0.6938 2024-05-28 19:54:52,777 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 12:33:11, time: 0.932, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0442, loss_cls: 0.1845, acc: 93.1472, loss_bbox: 0.2381, loss_mask: 0.2400, loss: 0.7268 2024-05-28 19:55:38,780 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 12:32:24, time: 0.920, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0425, loss_cls: 0.1766, acc: 93.4421, loss_bbox: 0.2302, loss_mask: 0.2324, loss: 0.6991 2024-05-28 19:56:25,715 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 12:31:38, time: 0.939, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0480, loss_cls: 0.1927, acc: 92.9250, loss_bbox: 0.2467, loss_mask: 0.2410, loss: 0.7481 2024-05-28 19:57:11,762 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 12:30:50, time: 0.921, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0405, loss_cls: 0.1720, acc: 93.6812, loss_bbox: 0.2206, loss_mask: 0.2295, loss: 0.6800 2024-05-28 19:57:58,328 - mmdet - INFO - 
Epoch [6][3050/7330] lr: 1.000e-04, eta: 12:30:04, time: 0.931, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0451, loss_cls: 0.1772, acc: 93.4763, loss_bbox: 0.2299, loss_mask: 0.2301, loss: 0.7006 2024-05-28 19:58:44,535 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 12:29:16, time: 0.924, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0446, loss_cls: 0.1783, acc: 93.5193, loss_bbox: 0.2264, loss_mask: 0.2385, loss: 0.7068 2024-05-28 19:59:33,207 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 12:28:32, time: 0.973, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0441, loss_cls: 0.1918, acc: 92.9851, loss_bbox: 0.2428, loss_mask: 0.2424, loss: 0.7416 2024-05-28 20:00:21,407 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 12:27:48, time: 0.964, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0455, loss_cls: 0.1884, acc: 93.0151, loss_bbox: 0.2380, loss_mask: 0.2377, loss: 0.7284 2024-05-28 20:01:07,009 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 12:27:00, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0413, loss_cls: 0.1851, acc: 93.2097, loss_bbox: 0.2315, loss_mask: 0.2356, loss: 0.7115 2024-05-28 20:01:53,526 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 12:26:13, time: 0.930, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0436, loss_cls: 0.1784, acc: 93.5037, loss_bbox: 0.2269, loss_mask: 0.2331, loss: 0.6998 2024-05-28 20:02:42,242 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 20:02:42,242 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 12:25:29, time: 0.974, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0433, loss_cls: 0.1854, acc: 93.2993, loss_bbox: 0.2314, loss_mask: 0.2360, loss: 0.7135 2024-05-28 20:03:27,680 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 12:24:41, time: 0.908, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0462, loss_cls: 0.1829, acc: 93.2976, loss_bbox: 0.2331, loss_mask: 0.2339, loss: 0.7158 2024-05-28 20:04:13,576 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 12:23:53, time: 0.918, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0411, loss_cls: 0.1724, acc: 93.6694, loss_bbox: 0.2190, loss_mask: 0.2237, loss: 0.6725 2024-05-28 20:04:59,801 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 12:23:06, time: 0.924, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0455, loss_cls: 0.1952, acc: 92.8423, loss_bbox: 0.2494, loss_mask: 0.2417, loss: 0.7481 2024-05-28 20:05:50,383 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 12:22:24, time: 1.012, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0453, loss_cls: 0.1942, acc: 92.9792, loss_bbox: 0.2418, loss_mask: 0.2415, loss: 0.7416 2024-05-28 20:06:36,427 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 12:21:37, time: 0.921, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0439, loss_cls: 0.1962, acc: 92.7988, loss_bbox: 0.2442, loss_mask: 0.2384, loss: 0.7426 2024-05-28 20:07:22,959 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 12:20:50, time: 0.931, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0443, loss_cls: 0.1865, acc: 93.0999, loss_bbox: 0.2423, loss_mask: 0.2396, loss: 0.7318 2024-05-28 20:08:09,560 - mmdet - 
INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 12:20:03, time: 0.932, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0436, loss_cls: 0.1879, acc: 93.1421, loss_bbox: 0.2351, loss_mask: 0.2359, loss: 0.7209 2024-05-28 20:08:58,567 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 12:19:20, time: 0.980, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0461, loss_cls: 0.1765, acc: 93.4988, loss_bbox: 0.2226, loss_mask: 0.2290, loss: 0.6921 2024-05-28 20:09:46,576 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 12:18:35, time: 0.960, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0445, loss_cls: 0.1825, acc: 93.2820, loss_bbox: 0.2358, loss_mask: 0.2383, loss: 0.7189 2024-05-28 20:10:35,592 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 12:17:51, time: 0.980, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0442, loss_cls: 0.1905, acc: 93.0103, loss_bbox: 0.2409, loss_mask: 0.2326, loss: 0.7254 2024-05-28 20:11:22,290 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 12:17:04, time: 0.934, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0464, loss_cls: 0.1857, acc: 93.1335, loss_bbox: 0.2368, loss_mask: 0.2365, loss: 0.7241 2024-05-28 20:12:08,393 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 12:16:17, time: 0.922, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0466, loss_cls: 0.1816, acc: 93.2734, loss_bbox: 0.2356, loss_mask: 0.2366, loss: 0.7191 2024-05-28 20:12:54,710 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 12:15:30, time: 0.926, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0448, loss_cls: 0.1816, acc: 93.3560, loss_bbox: 0.2295, loss_mask: 0.2388, loss: 0.7138 2024-05-28 20:13:40,828 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 12:14:43, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0436, loss_cls: 0.1845, acc: 93.3569, loss_bbox: 0.2282, loss_mask: 0.2346, loss: 0.7095 2024-05-28 20:14:26,858 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 12:13:55, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0470, loss_cls: 0.1891, acc: 92.9941, loss_bbox: 0.2363, loss_mask: 0.2357, loss: 0.7272 2024-05-28 20:15:13,377 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 12:13:08, time: 0.930, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0463, loss_cls: 0.1874, acc: 93.0994, loss_bbox: 0.2378, loss_mask: 0.2385, loss: 0.7289 2024-05-28 20:16:02,471 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 12:12:25, time: 0.982, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0460, loss_cls: 0.1839, acc: 93.2642, loss_bbox: 0.2330, loss_mask: 0.2364, loss: 0.7176 2024-05-28 20:16:50,871 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 12:11:40, time: 0.968, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0445, loss_cls: 0.1802, acc: 93.3628, loss_bbox: 0.2310, loss_mask: 0.2379, loss: 0.7109 2024-05-28 20:17:36,892 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 12:10:53, time: 0.920, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0447, loss_cls: 0.1824, acc: 93.2202, loss_bbox: 0.2341, loss_mask: 0.2352, loss: 0.7150 2024-05-28 20:18:22,816 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 20:18:22,816 - 
mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 12:10:05, time: 0.919, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0435, loss_cls: 0.1814, acc: 93.2380, loss_bbox: 0.2330, loss_mask: 0.2401, loss: 0.7158 2024-05-28 20:19:11,363 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 12:09:21, time: 0.971, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0439, loss_cls: 0.1888, acc: 93.1372, loss_bbox: 0.2359, loss_mask: 0.2359, loss: 0.7221 2024-05-28 20:19:57,620 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 12:08:34, time: 0.925, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0449, loss_cls: 0.1839, acc: 93.2732, loss_bbox: 0.2329, loss_mask: 0.2347, loss: 0.7136 2024-05-28 20:20:43,919 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 12:07:47, time: 0.926, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0447, loss_cls: 0.1858, acc: 93.0481, loss_bbox: 0.2375, loss_mask: 0.2352, loss: 0.7204 2024-05-28 20:21:29,898 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 12:06:59, time: 0.920, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0426, loss_cls: 0.1794, acc: 93.4502, loss_bbox: 0.2271, loss_mask: 0.2341, loss: 0.7000 2024-05-28 20:22:20,648 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 12:06:17, time: 1.015, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0422, loss_cls: 0.1759, acc: 93.6267, loss_bbox: 0.2197, loss_mask: 0.2201, loss: 0.6754 2024-05-28 20:23:07,020 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 12:05:30, time: 0.927, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0428, loss_cls: 0.1788, acc: 93.4868, loss_bbox: 0.2240, loss_mask: 0.2405, loss: 0.7050 2024-05-28 20:23:52,218 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 12:04:42, time: 0.904, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0416, loss_cls: 0.1825, acc: 93.2715, loss_bbox: 0.2314, loss_mask: 0.2363, loss: 0.7082 2024-05-28 20:24:38,460 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 12:03:55, time: 0.925, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0439, loss_cls: 0.1838, acc: 93.2163, loss_bbox: 0.2341, loss_mask: 0.2319, loss: 0.7120 2024-05-28 20:25:26,638 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 12:03:10, time: 0.964, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0450, loss_cls: 0.1748, acc: 93.5525, loss_bbox: 0.2283, loss_mask: 0.2271, loss: 0.6927 2024-05-28 20:26:14,786 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 12:02:25, time: 0.963, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0415, loss_cls: 0.1736, acc: 93.6777, loss_bbox: 0.2207, loss_mask: 0.2307, loss: 0.6830 2024-05-28 20:27:03,393 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 12:01:41, time: 0.972, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0439, loss_cls: 0.1842, acc: 93.2441, loss_bbox: 0.2365, loss_mask: 0.2391, loss: 0.7225 2024-05-28 20:27:49,162 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 12:00:53, time: 0.915, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0459, loss_cls: 0.1872, acc: 93.0574, loss_bbox: 0.2349, loss_mask: 0.2375, loss: 0.7242 2024-05-28 20:28:35,119 - mmdet - INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 12:00:06, time: 0.919, data_time: 0.043, 
memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0418, loss_cls: 0.1769, acc: 93.5522, loss_bbox: 0.2231, loss_mask: 0.2334, loss: 0.6924 2024-05-28 20:29:21,601 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 11:59:19, time: 0.930, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0427, loss_cls: 0.1867, acc: 93.0212, loss_bbox: 0.2385, loss_mask: 0.2311, loss: 0.7174 2024-05-28 20:30:07,472 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 11:58:31, time: 0.917, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0416, loss_cls: 0.1826, acc: 93.3706, loss_bbox: 0.2322, loss_mask: 0.2365, loss: 0.7102 2024-05-28 20:30:53,666 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 11:57:44, time: 0.924, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0443, loss_cls: 0.1858, acc: 93.1731, loss_bbox: 0.2351, loss_mask: 0.2337, loss: 0.7180 2024-05-28 20:31:40,484 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 11:56:58, time: 0.936, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0441, loss_cls: 0.1858, acc: 93.2122, loss_bbox: 0.2281, loss_mask: 0.2299, loss: 0.7066 2024-05-28 20:32:29,493 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 11:56:14, time: 0.980, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0440, loss_cls: 0.1912, acc: 93.0151, loss_bbox: 0.2379, loss_mask: 0.2412, loss: 0.7319 2024-05-28 20:33:17,831 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 11:55:29, time: 0.967, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0445, loss_cls: 0.1803, acc: 93.5566, loss_bbox: 0.2157, loss_mask: 0.2321, loss: 0.6912 2024-05-28 20:34:03,893 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 20:34:03,893 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 11:54:41, time: 0.921, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0430, loss_cls: 0.1845, acc: 93.3113, loss_bbox: 0.2264, loss_mask: 0.2365, loss: 0.7082 2024-05-28 20:34:49,994 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 11:53:54, time: 0.922, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0434, loss_cls: 0.1818, acc: 93.3704, loss_bbox: 0.2288, loss_mask: 0.2387, loss: 0.7101 2024-05-28 20:35:38,303 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 11:53:09, time: 0.966, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0413, loss_cls: 0.1832, acc: 93.1721, loss_bbox: 0.2271, loss_mask: 0.2309, loss: 0.6999 2024-05-28 20:36:25,033 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 11:52:23, time: 0.935, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0453, loss_cls: 0.1874, acc: 93.1836, loss_bbox: 0.2362, loss_mask: 0.2373, loss: 0.7237 2024-05-28 20:37:11,141 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 11:51:36, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0440, loss_cls: 0.1923, acc: 92.9006, loss_bbox: 0.2440, loss_mask: 0.2399, loss: 0.7383 2024-05-28 20:37:56,785 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 11:50:48, time: 0.913, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0426, loss_cls: 0.1780, acc: 93.5525, loss_bbox: 0.2243, loss_mask: 0.2305, loss: 0.6945 2024-05-28 20:38:47,424 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, eta: 11:50:05, time: 1.013, data_time: 
0.040, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0425, loss_cls: 0.1862, acc: 93.1013, loss_bbox: 0.2326, loss_mask: 0.2372, loss: 0.7164 2024-05-28 20:39:33,525 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 11:49:18, time: 0.922, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0444, loss_cls: 0.1882, acc: 93.0891, loss_bbox: 0.2364, loss_mask: 0.2350, loss: 0.7217 2024-05-28 20:40:19,985 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 11:48:31, time: 0.929, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0431, loss_cls: 0.1859, acc: 93.0801, loss_bbox: 0.2385, loss_mask: 0.2383, loss: 0.7245 2024-05-28 20:41:06,063 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 11:47:44, time: 0.922, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0470, loss_cls: 0.1929, acc: 92.8352, loss_bbox: 0.2399, loss_mask: 0.2388, loss: 0.7362 2024-05-28 20:41:51,507 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 11:46:56, time: 0.909, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0454, loss_cls: 0.1911, acc: 93.0659, loss_bbox: 0.2380, loss_mask: 0.2400, loss: 0.7326 2024-05-28 20:42:42,320 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 11:46:14, time: 1.016, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0447, loss_cls: 0.1852, acc: 93.0957, loss_bbox: 0.2366, loss_mask: 0.2388, loss: 0.7238 2024-05-28 20:43:30,191 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 11:45:26, time: 0.914, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0434, loss_cls: 0.1865, acc: 93.2751, loss_bbox: 0.2382, loss_mask: 0.2356, loss: 0.7209 2024-05-28 20:44:16,842 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 11:44:42, time: 0.976, data_time: 0.094, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0449, loss_cls: 0.1918, acc: 92.9729, loss_bbox: 0.2416, loss_mask: 0.2427, loss: 0.7400 2024-05-28 20:45:03,705 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 11:43:55, time: 0.937, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0445, loss_cls: 0.1847, acc: 93.2512, loss_bbox: 0.2361, loss_mask: 0.2408, loss: 0.7237 2024-05-28 20:45:50,080 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 11:43:08, time: 0.927, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0453, loss_cls: 0.1857, acc: 93.0374, loss_bbox: 0.2392, loss_mask: 0.2393, loss: 0.7274 2024-05-28 20:46:36,600 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 11:42:22, time: 0.930, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0448, loss_cls: 0.1870, acc: 93.0627, loss_bbox: 0.2368, loss_mask: 0.2312, loss: 0.7165 2024-05-28 20:47:22,262 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 11:41:34, time: 0.913, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0435, loss_cls: 0.1854, acc: 93.2195, loss_bbox: 0.2323, loss_mask: 0.2387, loss: 0.7177 2024-05-28 20:48:08,709 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 11:40:47, time: 0.929, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0456, loss_cls: 0.1875, acc: 93.0913, loss_bbox: 0.2370, loss_mask: 0.2353, loss: 0.7230 2024-05-28 20:48:57,623 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 11:40:03, time: 0.978, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0418, loss_cls: 0.1839, acc: 
93.3044, loss_bbox: 0.2295, loss_mask: 0.2288, loss: 0.7024 2024-05-28 20:49:45,916 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-28 20:49:45,916 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 11:39:18, time: 0.966, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0469, loss_cls: 0.1869, acc: 93.1260, loss_bbox: 0.2381, loss_mask: 0.2330, loss: 0.7232 2024-05-28 20:50:32,454 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 11:38:31, time: 0.931, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0442, loss_cls: 0.1879, acc: 93.1533, loss_bbox: 0.2341, loss_mask: 0.2349, loss: 0.7194 2024-05-28 20:51:18,466 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 11:37:44, time: 0.920, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0439, loss_cls: 0.1870, acc: 93.0239, loss_bbox: 0.2444, loss_mask: 0.2343, loss: 0.7271 2024-05-28 20:52:06,861 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 11:36:59, time: 0.968, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0456, loss_cls: 0.1863, acc: 93.2048, loss_bbox: 0.2347, loss_mask: 0.2324, loss: 0.7184 2024-05-28 20:52:53,125 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 11:36:12, time: 0.925, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0412, loss_cls: 0.1722, acc: 93.4788, loss_bbox: 0.2285, loss_mask: 0.2326, loss: 0.6919 2024-05-28 20:53:39,467 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 11:35:25, time: 0.927, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0494, loss_cls: 0.1861, acc: 93.1030, loss_bbox: 0.2395, loss_mask: 0.2386, loss: 0.7331 2024-05-28 20:54:25,740 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 11:34:38, time: 0.925, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0470, loss_cls: 0.1857, acc: 93.1663, loss_bbox: 0.2337, loss_mask: 0.2372, loss: 0.7218 2024-05-28 20:55:14,013 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 11:33:53, time: 0.965, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0413, loss_cls: 0.1776, acc: 93.4165, loss_bbox: 0.2221, loss_mask: 0.2247, loss: 0.6832 2024-05-28 20:56:02,361 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 11:33:08, time: 0.967, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0433, loss_cls: 0.1870, acc: 93.0618, loss_bbox: 0.2345, loss_mask: 0.2377, loss: 0.7209 2024-05-28 20:56:48,559 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 11:32:21, time: 0.924, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0461, loss_cls: 0.1858, acc: 93.3176, loss_bbox: 0.2303, loss_mask: 0.2283, loss: 0.7097 2024-05-28 20:57:34,105 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 11:31:33, time: 0.911, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0432, loss_cls: 0.1830, acc: 93.2854, loss_bbox: 0.2303, loss_mask: 0.2371, loss: 0.7121 2024-05-28 20:58:20,324 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 11:30:46, time: 0.924, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0443, loss_cls: 0.1831, acc: 93.3054, loss_bbox: 0.2230, loss_mask: 0.2326, loss: 0.7019 2024-05-28 20:59:10,225 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 11:30:02, time: 0.998, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0438, loss_cls: 
0.1829, acc: 93.2156, loss_bbox: 0.2312, loss_mask: 0.2318, loss: 0.7075
2024-05-28 20:59:56,019 - mmdet - INFO - Epoch [6][7000/7330] lr: 1.000e-04, eta: 11:29:15, time: 0.916, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0427, loss_cls: 0.1853, acc: 93.1360, loss_bbox: 0.2350, loss_mask: 0.2407, loss: 0.7227
2024-05-28 21:00:44,264 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 11:28:30, time: 0.965, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0435, loss_cls: 0.1821, acc: 93.3105, loss_bbox: 0.2251, loss_mask: 0.2356, loss: 0.7036
2024-05-28 21:01:30,979 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 11:27:43, time: 0.934, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0467, loss_cls: 0.1914, acc: 93.0095, loss_bbox: 0.2397, loss_mask: 0.2380, loss: 0.7347
2024-05-28 21:02:17,110 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 11:26:56, time: 0.923, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0452, loss_cls: 0.1948, acc: 92.8892, loss_bbox: 0.2381, loss_mask: 0.2307, loss: 0.7267
2024-05-28 21:03:03,148 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 11:26:09, time: 0.921, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0455, loss_cls: 0.1805, acc: 93.2649, loss_bbox: 0.2306, loss_mask: 0.2329, loss: 0.7093
2024-05-28 21:03:48,665 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 11:25:21, time: 0.910, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0435, loss_cls: 0.1771, acc: 93.4590, loss_bbox: 0.2243, loss_mask: 0.2330, loss: 0.6948
2024-05-28 21:04:34,736 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 11:24:33, time: 0.921, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0448, loss_cls: 0.1904, acc: 93.0730, loss_bbox: 0.2378, loss_mask: 0.2389, loss: 0.7304
2024-05-28 21:05:03,169 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-05-28 21:07:15,226 - mmdet - INFO - Evaluating bbox...
2024-05-28 21:07:40,463 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.465
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.706
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.514
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.294
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.511
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.590
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.590
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.590
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.403
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.637
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.742
2024-05-28 21:07:40,464 - mmdet - INFO - Evaluating segm...
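The headline validation numbers are easiest to track through the Epoch(val) summary lines (the epoch-5 summary appears above; the epoch-6 summary follows below): bbox_mAP improves from 0.454 to 0.465 and segm_mAP from 0.411 to 0.420. A minimal sketch for scraping those summaries out of a saved log file; the path train.log is a placeholder:

```python
# Minimal sketch (train.log is a placeholder path): collect per-epoch validation
# results from the "Epoch(val)" summary lines of an mmdet-style log, e.g.
# epoch 5 -> bbox_mAP 0.454 / segm_mAP 0.411, epoch 6 -> bbox_mAP 0.465 / segm_mAP 0.420.
import re

pattern = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)")

results = {}
with open("train.log") as f:
    for line in f:
        m = pattern.search(line)
        if m:
            results[int(m.group(1))] = (float(m.group(2)), float(m.group(3)))

for epoch, (bbox_map, segm_map) in sorted(results.items()):
    print(f"epoch {epoch}: bbox_mAP={bbox_map:.3f}  segm_mAP={segm_map:.3f}")
```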
2024-05-28 21:08:05,653 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.420
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.670
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.450
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.207
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.459
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.637
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.535
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.535
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.535
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.330
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.584
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.719
2024-05-28 21:08:06,016 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 21:08:06,018 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4650, bbox_mAP_50: 0.7060, bbox_mAP_75: 0.5140, bbox_mAP_s: 0.2940, bbox_mAP_m: 0.5110, bbox_mAP_l: 0.6250, bbox_mAP_copypaste: 0.465 0.706 0.514 0.294 0.511 0.625, segm_mAP: 0.4200, segm_mAP_50: 0.6700, segm_mAP_75: 0.4500, segm_mAP_s: 0.2070, segm_mAP_m: 0.4590, segm_mAP_l: 0.6370, segm_mAP_copypaste: 0.420 0.670 0.450 0.207 0.459 0.637
2024-05-28 21:09:00,877 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 11:22:59, time: 1.097, data_time: 0.112, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0422, loss_cls: 0.1700, acc: 93.7769, loss_bbox: 0.2225, loss_mask: 0.2275, loss: 0.6785
2024-05-28 21:09:47,108 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 11:22:12, time: 0.925, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0427, loss_cls: 0.1791, acc: 93.3281, loss_bbox: 0.2289, loss_mask: 0.2292, loss: 0.6961
2024-05-28 21:10:35,505 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 11:21:27, time: 0.968, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0408, loss_cls: 0.1745, acc: 93.4680, loss_bbox: 0.2262, loss_mask: 0.2294, loss: 0.6867
2024-05-28 21:11:21,366 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 11:20:40, time: 0.917, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0436, loss_cls: 0.1791, acc: 93.3838, loss_bbox: 0.2281, loss_mask: 0.2313, loss: 0.7001
2024-05-28 21:12:07,225 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 11:19:52, time: 0.917, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0426, loss_cls: 0.1730, acc: 93.5522, loss_bbox: 0.2218, loss_mask: 0.2254, loss: 0.6786
2024-05-28 21:12:53,678 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 11:19:05, time: 0.929, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0422, loss_cls: 0.1766, acc: 93.4492, loss_bbox: 0.2255, loss_mask: 0.2255, loss: 0.6864
2024-05-28 21:13:40,377 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 11:18:19, time: 0.934, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0436, loss_cls: 0.1803, acc: 93.3340, loss_bbox: 0.2287, loss_mask: 0.2360, loss: 0.7044
2024-05-28 21:14:26,847 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 11:17:32, time: 0.929, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0441, loss_cls: 0.1823, acc: 93.3284, loss_bbox: 0.2289,
loss_mask: 0.2312, loss: 0.7035 2024-05-28 21:15:17,870 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 11:16:50, time: 1.020, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0452, loss_cls: 0.1821, acc: 93.1292, loss_bbox: 0.2361, loss_mask: 0.2303, loss: 0.7118 2024-05-28 21:16:04,624 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 11:16:03, time: 0.935, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0465, loss_cls: 0.1835, acc: 93.1416, loss_bbox: 0.2334, loss_mask: 0.2313, loss: 0.7116 2024-05-28 21:16:50,562 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 11:15:16, time: 0.919, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0419, loss_cls: 0.1647, acc: 93.9014, loss_bbox: 0.2166, loss_mask: 0.2231, loss: 0.6614 2024-05-28 21:17:36,233 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 11:14:28, time: 0.913, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0416, loss_cls: 0.1745, acc: 93.4895, loss_bbox: 0.2291, loss_mask: 0.2373, loss: 0.6986 2024-05-28 21:18:22,121 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 11:13:41, time: 0.918, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0412, loss_cls: 0.1748, acc: 93.5439, loss_bbox: 0.2253, loss_mask: 0.2322, loss: 0.6909 2024-05-28 21:19:08,160 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 11:12:54, time: 0.921, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0438, loss_cls: 0.1809, acc: 93.2927, loss_bbox: 0.2316, loss_mask: 0.2322, loss: 0.7059 2024-05-28 21:19:54,392 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 11:12:07, time: 0.925, data_time: 0.035, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0456, loss_cls: 0.1766, acc: 93.5042, loss_bbox: 0.2293, loss_mask: 0.2331, loss: 0.7007 2024-05-28 21:20:39,966 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 11:11:19, time: 0.911, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0419, loss_cls: 0.1765, acc: 93.4614, loss_bbox: 0.2237, loss_mask: 0.2248, loss: 0.6843 2024-05-28 21:21:25,867 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 11:10:31, time: 0.918, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0440, loss_cls: 0.1790, acc: 93.4395, loss_bbox: 0.2314, loss_mask: 0.2297, loss: 0.7025 2024-05-28 21:22:14,952 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 11:09:47, time: 0.982, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0436, loss_cls: 0.1762, acc: 93.4915, loss_bbox: 0.2283, loss_mask: 0.2313, loss: 0.6958 2024-05-28 21:23:03,193 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 11:09:02, time: 0.965, data_time: 0.035, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0417, loss_cls: 0.1705, acc: 93.7126, loss_bbox: 0.2260, loss_mask: 0.2294, loss: 0.6833 2024-05-28 21:23:49,062 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 11:08:15, time: 0.917, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0455, loss_cls: 0.1839, acc: 93.1699, loss_bbox: 0.2403, loss_mask: 0.2343, loss: 0.7214 2024-05-28 21:24:37,832 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 11:07:30, time: 0.975, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0433, loss_cls: 0.1800, acc: 93.2356, loss_bbox: 0.2322, loss_mask: 0.2333, loss: 0.7060 2024-05-28 21:25:24,183 - mmdet - INFO - Epoch [7][1100/7330] lr: 
1.000e-04, eta: 11:06:43, time: 0.927, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0427, loss_cls: 0.1736, acc: 93.5862, loss_bbox: 0.2229, loss_mask: 0.2308, loss: 0.6870 2024-05-28 21:26:10,212 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 11:05:56, time: 0.921, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0418, loss_cls: 0.1756, acc: 93.5225, loss_bbox: 0.2234, loss_mask: 0.2260, loss: 0.6837 2024-05-28 21:26:58,691 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 11:05:11, time: 0.970, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0417, loss_cls: 0.1725, acc: 93.5979, loss_bbox: 0.2226, loss_mask: 0.2268, loss: 0.6809 2024-05-28 21:27:46,814 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 11:04:26, time: 0.962, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0424, loss_cls: 0.1787, acc: 93.2551, loss_bbox: 0.2272, loss_mask: 0.2293, loss: 0.6945 2024-05-28 21:28:32,597 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 11:03:38, time: 0.916, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0430, loss_cls: 0.1823, acc: 93.2517, loss_bbox: 0.2289, loss_mask: 0.2273, loss: 0.6994 2024-05-28 21:29:18,454 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 11:02:51, time: 0.917, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0407, loss_cls: 0.1696, acc: 93.6660, loss_bbox: 0.2199, loss_mask: 0.2290, loss: 0.6739 2024-05-28 21:30:04,158 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 11:02:03, time: 0.914, data_time: 0.036, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0433, loss_cls: 0.1736, acc: 93.5889, loss_bbox: 0.2254, loss_mask: 0.2309, loss: 0.6898 2024-05-28 21:30:54,490 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 11:01:20, time: 1.007, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0432, loss_cls: 0.1825, acc: 93.1470, loss_bbox: 0.2349, loss_mask: 0.2254, loss: 0.7033 2024-05-28 21:31:45,090 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 11:00:37, time: 1.012, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0385, loss_cls: 0.1696, acc: 93.7756, loss_bbox: 0.2142, loss_mask: 0.2220, loss: 0.6593 2024-05-28 21:32:30,922 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 10:59:50, time: 0.917, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0439, loss_cls: 0.1791, acc: 93.3892, loss_bbox: 0.2290, loss_mask: 0.2318, loss: 0.7015 2024-05-28 21:33:17,063 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 10:59:03, time: 0.923, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0428, loss_cls: 0.1801, acc: 93.3015, loss_bbox: 0.2313, loss_mask: 0.2337, loss: 0.7036 2024-05-28 21:34:02,618 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 10:58:15, time: 0.911, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0428, loss_cls: 0.1843, acc: 93.2893, loss_bbox: 0.2340, loss_mask: 0.2330, loss: 0.7128 2024-05-28 21:34:48,062 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 10:57:27, time: 0.908, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0423, loss_cls: 0.1684, acc: 93.7241, loss_bbox: 0.2184, loss_mask: 0.2252, loss: 0.6691 2024-05-28 21:35:33,976 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 10:56:40, time: 0.919, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0182, 
loss_rpn_bbox: 0.0429, loss_cls: 0.1764, acc: 93.5642, loss_bbox: 0.2238, loss_mask: 0.2295, loss: 0.6909 2024-05-28 21:36:19,914 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 10:55:53, time: 0.919, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0444, loss_cls: 0.1783, acc: 93.2434, loss_bbox: 0.2286, loss_mask: 0.2327, loss: 0.7010 2024-05-28 21:37:05,782 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 10:55:05, time: 0.917, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0395, loss_cls: 0.1726, acc: 93.6245, loss_bbox: 0.2165, loss_mask: 0.2232, loss: 0.6676 2024-05-28 21:37:51,698 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 10:54:18, time: 0.918, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0420, loss_cls: 0.1780, acc: 93.4038, loss_bbox: 0.2262, loss_mask: 0.2296, loss: 0.6918 2024-05-28 21:38:42,130 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 10:53:35, time: 1.009, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0447, loss_cls: 0.1806, acc: 93.3315, loss_bbox: 0.2279, loss_mask: 0.2288, loss: 0.6998 2024-05-28 21:39:28,504 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 10:52:48, time: 0.928, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0423, loss_cls: 0.1724, acc: 93.5759, loss_bbox: 0.2249, loss_mask: 0.2239, loss: 0.6808 2024-05-28 21:40:19,318 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 10:52:05, time: 1.016, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0455, loss_cls: 0.1788, acc: 93.3425, loss_bbox: 0.2336, loss_mask: 0.2299, loss: 0.7033 2024-05-28 21:41:08,034 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 10:51:20, time: 0.974, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0428, loss_cls: 0.1735, acc: 93.5486, loss_bbox: 0.2264, loss_mask: 0.2280, loss: 0.6860 2024-05-28 21:41:54,781 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 10:50:34, time: 0.935, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0450, loss_cls: 0.1731, acc: 93.6152, loss_bbox: 0.2247, loss_mask: 0.2240, loss: 0.6842 2024-05-28 21:42:41,941 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 10:49:47, time: 0.943, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0452, loss_cls: 0.1790, acc: 93.3857, loss_bbox: 0.2279, loss_mask: 0.2271, loss: 0.6973 2024-05-28 21:43:27,731 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 10:49:00, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0467, loss_cls: 0.1821, acc: 93.2234, loss_bbox: 0.2351, loss_mask: 0.2318, loss: 0.7124 2024-05-28 21:44:18,387 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 10:48:17, time: 1.013, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0441, loss_cls: 0.1831, acc: 93.1963, loss_bbox: 0.2369, loss_mask: 0.2335, loss: 0.7132 2024-05-28 21:45:04,595 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 10:47:30, time: 0.924, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0432, loss_cls: 0.1868, acc: 93.2073, loss_bbox: 0.2354, loss_mask: 0.2308, loss: 0.7144 2024-05-28 21:45:50,387 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 10:46:42, time: 0.916, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0400, loss_cls: 0.1692, acc: 93.5569, loss_bbox: 0.2172, loss_mask: 0.2257, 
loss: 0.6674 2024-05-28 21:46:36,172 - mmdet - INFO - Epoch [7][2450/7330] lr: 1.000e-04, eta: 10:45:55, time: 0.916, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0437, loss_cls: 0.1737, acc: 93.5347, loss_bbox: 0.2253, loss_mask: 0.2306, loss: 0.6904 2024-05-28 21:47:22,367 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 10:45:08, time: 0.923, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0422, loss_cls: 0.1785, acc: 93.3857, loss_bbox: 0.2303, loss_mask: 0.2310, loss: 0.6979 2024-05-28 21:48:14,295 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 10:44:26, time: 1.039, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0460, loss_cls: 0.1890, acc: 92.9641, loss_bbox: 0.2378, loss_mask: 0.2321, loss: 0.7227 2024-05-28 21:49:02,363 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 10:43:40, time: 0.961, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0428, loss_cls: 0.1695, acc: 93.7373, loss_bbox: 0.2163, loss_mask: 0.2274, loss: 0.6706 2024-05-28 21:49:49,719 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 10:42:54, time: 0.947, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0457, loss_cls: 0.1791, acc: 93.3152, loss_bbox: 0.2316, loss_mask: 0.2318, loss: 0.7056 2024-05-28 21:50:34,797 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 10:42:06, time: 0.902, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0397, loss_cls: 0.1629, acc: 94.0178, loss_bbox: 0.2118, loss_mask: 0.2261, loss: 0.6561 2024-05-28 21:51:20,507 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 10:41:19, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0443, loss_cls: 0.1751, acc: 93.4236, loss_bbox: 0.2274, loss_mask: 0.2369, loss: 0.6994 2024-05-28 21:52:06,391 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 10:40:31, time: 0.918, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0437, loss_cls: 0.1765, acc: 93.5303, loss_bbox: 0.2208, loss_mask: 0.2332, loss: 0.6922 2024-05-28 21:52:51,838 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 10:39:44, time: 0.909, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0419, loss_cls: 0.1754, acc: 93.4883, loss_bbox: 0.2257, loss_mask: 0.2281, loss: 0.6867 2024-05-28 21:53:37,212 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 10:38:56, time: 0.907, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0422, loss_cls: 0.1724, acc: 93.6580, loss_bbox: 0.2244, loss_mask: 0.2316, loss: 0.6885 2024-05-28 21:54:23,543 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 10:38:09, time: 0.927, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0444, loss_cls: 0.1831, acc: 93.1729, loss_bbox: 0.2307, loss_mask: 0.2313, loss: 0.7065 2024-05-28 21:55:12,229 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 10:37:24, time: 0.974, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0453, loss_cls: 0.1820, acc: 93.2078, loss_bbox: 0.2315, loss_mask: 0.2293, loss: 0.7046 2024-05-28 21:55:57,965 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 10:36:37, time: 0.915, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0431, loss_cls: 0.1831, acc: 93.1582, loss_bbox: 0.2312, loss_mask: 0.2272, loss: 0.7028 2024-05-28 21:56:46,427 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 
10:35:52, time: 0.969, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0405, loss_cls: 0.1799, acc: 93.2715, loss_bbox: 0.2297, loss_mask: 0.2316, loss: 0.6979 2024-05-28 21:57:31,765 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 10:35:04, time: 0.907, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0407, loss_cls: 0.1712, acc: 93.7268, loss_bbox: 0.2214, loss_mask: 0.2289, loss: 0.6799 2024-05-28 21:58:19,099 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 10:34:18, time: 0.947, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0421, loss_cls: 0.1778, acc: 93.3621, loss_bbox: 0.2287, loss_mask: 0.2298, loss: 0.6947 2024-05-28 21:59:05,630 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 10:33:31, time: 0.931, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0461, loss_cls: 0.1844, acc: 93.2161, loss_bbox: 0.2328, loss_mask: 0.2369, loss: 0.7186 2024-05-28 21:59:51,457 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 10:32:43, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0424, loss_cls: 0.1812, acc: 93.3840, loss_bbox: 0.2283, loss_mask: 0.2317, loss: 0.7012 2024-05-28 22:00:41,860 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 10:32:00, time: 1.008, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0410, loss_cls: 0.1763, acc: 93.5217, loss_bbox: 0.2247, loss_mask: 0.2321, loss: 0.6907 2024-05-28 22:01:28,100 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 10:31:13, time: 0.925, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0454, loss_cls: 0.1835, acc: 93.2329, loss_bbox: 0.2307, loss_mask: 0.2356, loss: 0.7122 2024-05-28 22:02:14,036 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 10:30:26, time: 0.919, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0440, loss_cls: 0.1788, acc: 93.2844, loss_bbox: 0.2325, loss_mask: 0.2315, loss: 0.7031 2024-05-28 22:03:00,381 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 10:29:39, time: 0.927, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0445, loss_cls: 0.1861, acc: 93.1216, loss_bbox: 0.2323, loss_mask: 0.2316, loss: 0.7126 2024-05-28 22:03:46,385 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 10:28:52, time: 0.920, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0441, loss_cls: 0.1744, acc: 93.5266, loss_bbox: 0.2261, loss_mask: 0.2269, loss: 0.6872 2024-05-28 22:04:34,571 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 10:28:06, time: 0.964, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0442, loss_cls: 0.1822, acc: 93.2773, loss_bbox: 0.2314, loss_mask: 0.2313, loss: 0.7074 2024-05-28 22:05:25,187 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 10:27:23, time: 1.012, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0406, loss_cls: 0.1794, acc: 93.3647, loss_bbox: 0.2262, loss_mask: 0.2280, loss: 0.6921 2024-05-28 22:06:11,299 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 10:26:36, time: 0.922, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0423, loss_cls: 0.1773, acc: 93.4092, loss_bbox: 0.2323, loss_mask: 0.2259, loss: 0.6951 2024-05-28 22:06:57,833 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 10:25:49, time: 0.931, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 
0.0448, loss_cls: 0.1869, acc: 93.1201, loss_bbox: 0.2349, loss_mask: 0.2340, loss: 0.7173 2024-05-28 22:07:43,110 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 10:25:01, time: 0.906, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0378, loss_cls: 0.1728, acc: 93.6519, loss_bbox: 0.2205, loss_mask: 0.2297, loss: 0.6748 2024-05-28 22:08:29,859 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 10:24:14, time: 0.935, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0452, loss_cls: 0.1751, acc: 93.5391, loss_bbox: 0.2246, loss_mask: 0.2258, loss: 0.6874 2024-05-28 22:09:16,064 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 10:23:27, time: 0.924, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0454, loss_cls: 0.1873, acc: 93.0962, loss_bbox: 0.2348, loss_mask: 0.2327, loss: 0.7169 2024-05-28 22:10:02,315 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 10:22:40, time: 0.925, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0429, loss_cls: 0.1795, acc: 93.2947, loss_bbox: 0.2245, loss_mask: 0.2335, loss: 0.6962 2024-05-28 22:10:48,934 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 10:21:54, time: 0.932, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0449, loss_cls: 0.1792, acc: 93.4302, loss_bbox: 0.2262, loss_mask: 0.2280, loss: 0.6944 2024-05-28 22:11:37,136 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 10:21:08, time: 0.964, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0435, loss_cls: 0.1736, acc: 93.5466, loss_bbox: 0.2225, loss_mask: 0.2318, loss: 0.6870 2024-05-28 22:12:22,985 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 10:20:21, time: 0.917, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0449, loss_cls: 0.1888, acc: 93.0012, loss_bbox: 0.2372, loss_mask: 0.2314, loss: 0.7190 2024-05-28 22:13:10,523 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 10:19:35, time: 0.951, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0428, loss_cls: 0.1799, acc: 93.3469, loss_bbox: 0.2292, loss_mask: 0.2305, loss: 0.6989 2024-05-28 22:13:56,557 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 10:18:48, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0442, loss_cls: 0.1792, acc: 93.3289, loss_bbox: 0.2288, loss_mask: 0.2323, loss: 0.7014 2024-05-28 22:14:44,715 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 10:18:02, time: 0.963, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0422, loss_cls: 0.1739, acc: 93.5928, loss_bbox: 0.2169, loss_mask: 0.2267, loss: 0.6767 2024-05-28 22:15:30,640 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 10:17:15, time: 0.919, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0420, loss_cls: 0.1816, acc: 93.3184, loss_bbox: 0.2277, loss_mask: 0.2338, loss: 0.7020 2024-05-28 22:16:16,234 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 10:16:28, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0434, loss_cls: 0.1733, acc: 93.5647, loss_bbox: 0.2230, loss_mask: 0.2271, loss: 0.6829 2024-05-28 22:17:04,560 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 10:15:42, time: 0.966, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0397, loss_cls: 0.1673, acc: 93.7163, loss_bbox: 0.2174, loss_mask: 0.2281, loss: 0.6662 
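The per-iteration entries above and below report the individual loss terms (loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox, loss_mask) together with their sum (loss), the learning rate, iteration time, and GPU memory, but nothing in the log aggregates them. The following is a minimal, hypothetical sketch (the regex, function name, and log path are placeholders, not part of this run) that averages the total loss per epoch from a plain-text MMDetection 2.x log such as this one; the paired .log.json written by the same run can also be plotted with MMDetection's tools/analysis_tools/analyze_logs.py.

# Minimal sketch, not part of the original run: average the total "loss" field
# per epoch from a plain-text mmdet training log. The log path is a placeholder.
import re
from collections import defaultdict

# Captures the epoch number and the total loss of each training entry; re.S lets
# it tolerate entries that wrap across lines.
ENTRY = re.compile(r"Epoch \[(\d+)\]\[\d+/\d+\].*?loss: ([0-9.]+)", re.S)

def mean_loss_per_epoch(log_path):
    sums, counts = defaultdict(float), defaultdict(int)
    with open(log_path) as f:
        text = f.read()
    for m in ENTRY.finditer(text):
        epoch, total_loss = int(m.group(1)), float(m.group(2))
        sums[epoch] += total_loss
        counts[epoch] += 1
    return {e: round(sums[e] / counts[e], 4) for e in sorted(sums)}

if __name__ == "__main__":
    print(mean_loss_per_epoch("path/to/train.log"))  # -> {epoch: mean total loss}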
2024-05-28 22:17:52,283 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 10:14:56, time: 0.954, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0423, loss_cls: 0.1819, acc: 93.3147, loss_bbox: 0.2297, loss_mask: 0.2344, loss: 0.7053 2024-05-28 22:18:38,804 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 10:14:10, time: 0.930, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0417, loss_cls: 0.1759, acc: 93.4473, loss_bbox: 0.2259, loss_mask: 0.2285, loss: 0.6877 2024-05-28 22:19:25,215 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 10:13:23, time: 0.928, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0431, loss_cls: 0.1796, acc: 93.3313, loss_bbox: 0.2294, loss_mask: 0.2353, loss: 0.7042 2024-05-28 22:20:11,320 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 10:12:36, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0423, loss_cls: 0.1695, acc: 93.7148, loss_bbox: 0.2185, loss_mask: 0.2281, loss: 0.6749 2024-05-28 22:20:59,705 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 10:11:50, time: 0.968, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0456, loss_cls: 0.1837, acc: 93.1704, loss_bbox: 0.2354, loss_mask: 0.2305, loss: 0.7131 2024-05-28 22:21:50,270 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 10:11:07, time: 1.011, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0429, loss_cls: 0.1776, acc: 93.5518, loss_bbox: 0.2288, loss_mask: 0.2284, loss: 0.6941 2024-05-28 22:22:36,058 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 10:10:19, time: 0.916, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0424, loss_cls: 0.1711, acc: 93.7224, loss_bbox: 0.2166, loss_mask: 0.2249, loss: 0.6715 2024-05-28 22:23:21,840 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 10:09:32, time: 0.916, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0410, loss_cls: 0.1755, acc: 93.4934, loss_bbox: 0.2219, loss_mask: 0.2292, loss: 0.6847 2024-05-28 22:24:07,759 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 10:08:45, time: 0.918, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0441, loss_cls: 0.1821, acc: 93.2290, loss_bbox: 0.2324, loss_mask: 0.2314, loss: 0.7081 2024-05-28 22:24:53,261 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 10:07:57, time: 0.910, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0386, loss_cls: 0.1662, acc: 93.8293, loss_bbox: 0.2193, loss_mask: 0.2246, loss: 0.6647 2024-05-28 22:25:38,932 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 10:07:10, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0439, loss_cls: 0.1841, acc: 93.1531, loss_bbox: 0.2367, loss_mask: 0.2342, loss: 0.7172 2024-05-28 22:26:24,503 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 10:06:22, time: 0.911, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0411, loss_cls: 0.1753, acc: 93.4871, loss_bbox: 0.2221, loss_mask: 0.2328, loss: 0.6870 2024-05-28 22:27:10,324 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 10:05:35, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0452, loss_cls: 0.1831, acc: 93.1709, loss_bbox: 0.2363, loss_mask: 0.2299, loss: 0.7105 2024-05-28 22:27:58,581 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 10:04:49, 
time: 0.965, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0476, loss_cls: 0.1824, acc: 93.2354, loss_bbox: 0.2375, loss_mask: 0.2363, loss: 0.7219 2024-05-28 22:28:43,729 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 10:04:02, time: 0.903, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0398, loss_cls: 0.1766, acc: 93.5215, loss_bbox: 0.2235, loss_mask: 0.2278, loss: 0.6836 2024-05-28 22:29:31,201 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 10:03:16, time: 0.949, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0429, loss_cls: 0.1720, acc: 93.5332, loss_bbox: 0.2207, loss_mask: 0.2283, loss: 0.6789 2024-05-28 22:30:16,897 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 10:02:28, time: 0.914, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0425, loss_cls: 0.1824, acc: 93.2378, loss_bbox: 0.2293, loss_mask: 0.2321, loss: 0.7023 2024-05-28 22:31:05,825 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 10:01:43, time: 0.978, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0416, loss_cls: 0.1751, acc: 93.4402, loss_bbox: 0.2328, loss_mask: 0.2331, loss: 0.6999 2024-05-28 22:31:51,577 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 10:00:56, time: 0.915, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0412, loss_cls: 0.1767, acc: 93.4746, loss_bbox: 0.2268, loss_mask: 0.2296, loss: 0.6902 2024-05-28 22:32:37,898 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 10:00:09, time: 0.926, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0432, loss_cls: 0.1795, acc: 93.3977, loss_bbox: 0.2298, loss_mask: 0.2308, loss: 0.7008 2024-05-28 22:33:26,926 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 9:59:24, time: 0.981, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0433, loss_cls: 0.1804, acc: 93.2678, loss_bbox: 0.2276, loss_mask: 0.2327, loss: 0.7015 2024-05-28 22:34:15,780 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 9:58:39, time: 0.977, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0420, loss_cls: 0.1779, acc: 93.4380, loss_bbox: 0.2297, loss_mask: 0.2291, loss: 0.6953 2024-05-28 22:35:01,503 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 9:57:52, time: 0.914, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0411, loss_cls: 0.1716, acc: 93.6772, loss_bbox: 0.2218, loss_mask: 0.2276, loss: 0.6772 2024-05-28 22:35:48,260 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 9:57:05, time: 0.935, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0436, loss_cls: 0.1858, acc: 93.2156, loss_bbox: 0.2358, loss_mask: 0.2350, loss: 0.7170 2024-05-28 22:36:33,743 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 9:56:17, time: 0.910, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0401, loss_cls: 0.1748, acc: 93.4695, loss_bbox: 0.2254, loss_mask: 0.2281, loss: 0.6839 2024-05-28 22:37:22,834 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 9:55:33, time: 0.982, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0416, loss_cls: 0.1779, acc: 93.4126, loss_bbox: 0.2286, loss_mask: 0.2328, loss: 0.6975 2024-05-28 22:38:13,234 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 9:54:49, time: 1.008, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0443, loss_cls: 
0.1785, acc: 93.3179, loss_bbox: 0.2294, loss_mask: 0.2301, loss: 0.6994 2024-05-28 22:38:59,443 - mmdet - INFO - Epoch [7][5800/7330] lr: 1.000e-04, eta: 9:54:02, time: 0.924, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0439, loss_cls: 0.1829, acc: 93.0986, loss_bbox: 0.2363, loss_mask: 0.2360, loss: 0.7166 2024-05-28 22:39:44,916 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 9:53:14, time: 0.909, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0421, loss_cls: 0.1744, acc: 93.6106, loss_bbox: 0.2216, loss_mask: 0.2262, loss: 0.6810 2024-05-28 22:40:30,975 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 9:52:27, time: 0.921, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0438, loss_cls: 0.1754, acc: 93.4778, loss_bbox: 0.2311, loss_mask: 0.2283, loss: 0.6957 2024-05-28 22:41:17,064 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 9:51:40, time: 0.922, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0454, loss_cls: 0.1852, acc: 93.1401, loss_bbox: 0.2331, loss_mask: 0.2333, loss: 0.7135 2024-05-28 22:42:02,830 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 9:50:53, time: 0.915, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0427, loss_cls: 0.1827, acc: 93.3118, loss_bbox: 0.2294, loss_mask: 0.2291, loss: 0.6999 2024-05-28 22:42:48,879 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 9:50:05, time: 0.921, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0445, loss_cls: 0.1873, acc: 93.0112, loss_bbox: 0.2357, loss_mask: 0.2332, loss: 0.7194 2024-05-28 22:43:34,982 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 9:49:18, time: 0.922, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0427, loss_cls: 0.1799, acc: 93.3733, loss_bbox: 0.2287, loss_mask: 0.2327, loss: 0.7013 2024-05-28 22:44:23,420 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 9:48:33, time: 0.969, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0463, loss_cls: 0.1790, acc: 93.3359, loss_bbox: 0.2301, loss_mask: 0.2333, loss: 0.7056 2024-05-28 22:45:10,194 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 9:47:46, time: 0.935, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0454, loss_cls: 0.1836, acc: 93.1455, loss_bbox: 0.2318, loss_mask: 0.2297, loss: 0.7095 2024-05-28 22:45:58,362 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 9:47:01, time: 0.963, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0418, loss_cls: 0.1773, acc: 93.4834, loss_bbox: 0.2262, loss_mask: 0.2327, loss: 0.6955 2024-05-28 22:46:43,532 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 9:46:13, time: 0.903, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0397, loss_cls: 0.1680, acc: 93.8687, loss_bbox: 0.2166, loss_mask: 0.2277, loss: 0.6682 2024-05-28 22:47:32,823 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 9:45:28, time: 0.986, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0459, loss_cls: 0.1880, acc: 93.0322, loss_bbox: 0.2352, loss_mask: 0.2375, loss: 0.7251 2024-05-28 22:48:18,823 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 9:44:41, time: 0.920, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0444, loss_cls: 0.1796, acc: 93.4128, loss_bbox: 0.2306, loss_mask: 0.2287, loss: 0.7020 2024-05-28 22:49:04,773 - mmdet - 
INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 9:43:54, time: 0.919, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0408, loss_cls: 0.1721, acc: 93.6570, loss_bbox: 0.2202, loss_mask: 0.2269, loss: 0.6760 2024-05-28 22:49:52,792 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 9:43:08, time: 0.960, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0421, loss_cls: 0.1741, acc: 93.5718, loss_bbox: 0.2226, loss_mask: 0.2266, loss: 0.6815 2024-05-28 22:50:40,707 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 9:42:22, time: 0.958, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0425, loss_cls: 0.1825, acc: 93.2466, loss_bbox: 0.2269, loss_mask: 0.2288, loss: 0.6984 2024-05-28 22:51:26,217 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 9:41:35, time: 0.910, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0431, loss_cls: 0.1794, acc: 93.3784, loss_bbox: 0.2252, loss_mask: 0.2304, loss: 0.6969 2024-05-28 22:52:11,823 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 9:40:47, time: 0.912, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0419, loss_cls: 0.1794, acc: 93.2622, loss_bbox: 0.2290, loss_mask: 0.2316, loss: 0.6967 2024-05-28 22:52:57,471 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 9:40:00, time: 0.913, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0410, loss_cls: 0.1779, acc: 93.2446, loss_bbox: 0.2258, loss_mask: 0.2319, loss: 0.6921 2024-05-28 22:53:46,299 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 9:39:15, time: 0.976, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0410, loss_cls: 0.1728, acc: 93.6230, loss_bbox: 0.2157, loss_mask: 0.2257, loss: 0.6702 2024-05-28 22:54:36,784 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 9:38:31, time: 1.010, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0426, loss_cls: 0.1801, acc: 93.3816, loss_bbox: 0.2307, loss_mask: 0.2318, loss: 0.7013 2024-05-28 22:55:22,026 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 9:37:43, time: 0.905, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0427, loss_cls: 0.1834, acc: 93.3213, loss_bbox: 0.2348, loss_mask: 0.2333, loss: 0.7131 2024-05-28 22:56:07,687 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 9:36:56, time: 0.913, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0411, loss_cls: 0.1707, acc: 93.7400, loss_bbox: 0.2211, loss_mask: 0.2272, loss: 0.6746 2024-05-28 22:56:53,809 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 9:36:09, time: 0.922, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0425, loss_cls: 0.1838, acc: 93.2617, loss_bbox: 0.2384, loss_mask: 0.2339, loss: 0.7147 2024-05-28 22:57:39,932 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 9:35:22, time: 0.922, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0388, loss_cls: 0.1721, acc: 93.6833, loss_bbox: 0.2226, loss_mask: 0.2270, loss: 0.6773 2024-05-28 22:58:25,332 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 9:34:34, time: 0.908, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0428, loss_cls: 0.1757, acc: 93.5432, loss_bbox: 0.2202, loss_mask: 0.2245, loss: 0.6807 2024-05-28 22:59:11,853 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 9:33:47, time: 0.930, data_time: 0.043, memory: 20925, 
loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0428, loss_cls: 0.1750, acc: 93.4409, loss_bbox: 0.2228, loss_mask: 0.2265, loss: 0.6834
2024-05-28 22:59:57,739 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 9:33:00, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0400, loss_cls: 0.1711, acc: 93.6138, loss_bbox: 0.2216, loss_mask: 0.2255, loss: 0.6748
2024-05-28 23:00:46,414 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 9:32:15, time: 0.974, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0430, loss_cls: 0.1836, acc: 93.2896, loss_bbox: 0.2281, loss_mask: 0.2255, loss: 0.6973
2024-05-28 23:01:31,721 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 9:31:27, time: 0.906, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0392, loss_cls: 0.1832, acc: 93.1870, loss_bbox: 0.2313, loss_mask: 0.2325, loss: 0.7016
2024-05-28 23:02:19,618 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 9:30:41, time: 0.958, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0428, loss_cls: 0.1830, acc: 93.1802, loss_bbox: 0.2317, loss_mask: 0.2349, loss: 0.7101
2024-05-28 23:02:48,081 - mmdet - INFO - Saving checkpoint at 7 epochs
2024-05-28 23:04:59,619 - mmdet - INFO - Evaluating bbox...
2024-05-28 23:05:26,356 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.472
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.709
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.522
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.292
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.514
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.636
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.398
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.632
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.755
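The twelve AP/AR lines above are the standard summary printed by pycocotools' COCOeval when the epoch-7 checkpoint is evaluated on the validation set. For reference, a table like this can be reproduced offline from ground-truth annotations and a COCO-format detection results file; the sketch below is illustrative only (the annotation and result paths are placeholders, and iouType would be switched to "segm" for the mask numbers that follow).

# Minimal sketch (paths are placeholders): reproduce a COCO-style AP/AR summary
# from ground-truth annotations and a COCO-format detection results file.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("annotations/instances_val2017.json")    # ground-truth annotations
coco_dt = coco_gt.loadRes("results.bbox.json")          # detections to score
coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")  # "segm" for mask metrics
coco_eval.params.maxDets = [100, 300, 1000]  # matches the maxDets shown in the log
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints the Average Precision / Average Recall table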
2024-05-28 23:05:26,356 - mmdet - INFO - Evaluating segm...
2024-05-28 23:05:51,483 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.673
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.453
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.214
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.463
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.641
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.331
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.580
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.726
2024-05-28 23:05:51,859 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-28 23:05:51,861 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4720, bbox_mAP_50: 0.7090, bbox_mAP_75: 0.5220, bbox_mAP_s: 0.2920, bbox_mAP_m: 0.5140, bbox_mAP_l: 0.6360, bbox_mAP_copypaste: 0.472 0.709 0.522 0.292 0.514 0.636, segm_mAP: 0.4240, segm_mAP_50: 0.6730, segm_mAP_75: 0.4530, segm_mAP_s: 0.2140, segm_mAP_m: 0.4630, segm_mAP_l: 0.6410, segm_mAP_copypaste: 0.424 0.673 0.453 0.214 0.463 0.641
2024-05-28 23:06:47,703 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 9:29:13, time: 1.116, data_time: 0.116, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0405, loss_cls: 0.1710, acc: 93.6982, loss_bbox: 0.2189, loss_mask: 0.2193, loss: 0.6658
2024-05-28 23:07:33,359 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 9:28:26, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0401, loss_cls: 0.1697, acc: 93.6399, loss_bbox: 0.2202, loss_mask: 0.2215, loss: 0.6669
2024-05-28 23:08:26,083 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 9:27:44, time: 1.054, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0434, loss_cls: 0.1695, acc: 93.5601, loss_bbox: 0.2263, loss_mask: 0.2256, loss: 0.6795
2024-05-28 23:09:11,799 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 9:26:56, time: 0.914, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0405, loss_cls: 0.1663, acc: 93.6780, loss_bbox: 0.2198, loss_mask: 0.2214, loss: 0.6647
2024-05-28 23:09:56,940 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 9:26:08, time: 0.903, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0400, loss_cls: 0.1643, acc: 93.8782, loss_bbox: 0.2148, loss_mask: 0.2240, loss: 0.6577
2024-05-28 23:10:42,537 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 9:25:21, time: 0.912, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0405, loss_cls: 0.1663, acc: 93.7661, loss_bbox: 0.2191, loss_mask: 0.2258, loss: 0.6672
2024-05-28 23:11:28,837 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 9:24:34, time: 0.926, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0410, loss_cls: 0.1663, acc: 93.8074, loss_bbox: 0.2195, loss_mask: 0.2263, loss: 0.6682
2024-05-28 23:12:14,920 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 9:23:47, time: 0.922, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0417, loss_cls: 0.1742, acc: 93.4492, loss_bbox: 0.2264, loss_mask:
0.2239, loss: 0.6817 2024-05-28 23:13:00,306 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 9:23:00, time: 0.908, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0405, loss_cls: 0.1658, acc: 93.8164, loss_bbox: 0.2144, loss_mask: 0.2219, loss: 0.6573 2024-05-28 23:13:46,290 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 9:22:12, time: 0.920, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0409, loss_cls: 0.1619, acc: 94.0537, loss_bbox: 0.2091, loss_mask: 0.2212, loss: 0.6471 2024-05-28 23:14:31,726 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 9:21:25, time: 0.909, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0377, loss_cls: 0.1604, acc: 94.0310, loss_bbox: 0.2103, loss_mask: 0.2205, loss: 0.6422 2024-05-28 23:15:17,691 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 9:20:38, time: 0.919, data_time: 0.067, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0420, loss_cls: 0.1720, acc: 93.6599, loss_bbox: 0.2225, loss_mask: 0.2240, loss: 0.6753 2024-05-28 23:16:03,068 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 9:19:50, time: 0.907, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0417, loss_cls: 0.1688, acc: 93.6191, loss_bbox: 0.2233, loss_mask: 0.2288, loss: 0.6786 2024-05-28 23:16:48,898 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 9:19:03, time: 0.917, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0414, loss_cls: 0.1717, acc: 93.5000, loss_bbox: 0.2243, loss_mask: 0.2282, loss: 0.6808 2024-05-28 23:17:36,974 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 9:18:17, time: 0.962, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0422, loss_cls: 0.1704, acc: 93.6567, loss_bbox: 0.2165, loss_mask: 0.2250, loss: 0.6683 2024-05-28 23:18:22,958 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 9:17:30, time: 0.920, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0412, loss_cls: 0.1689, acc: 93.6128, loss_bbox: 0.2204, loss_mask: 0.2245, loss: 0.6706 2024-05-28 23:19:13,169 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 9:16:46, time: 1.004, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0414, loss_cls: 0.1676, acc: 93.7900, loss_bbox: 0.2155, loss_mask: 0.2202, loss: 0.6602 2024-05-28 23:19:59,156 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 9:15:59, time: 0.920, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0432, loss_cls: 0.1734, acc: 93.6001, loss_bbox: 0.2230, loss_mask: 0.2314, loss: 0.6874 2024-05-28 23:20:45,486 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 9:15:12, time: 0.927, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0383, loss_cls: 0.1654, acc: 93.8425, loss_bbox: 0.2197, loss_mask: 0.2178, loss: 0.6551 2024-05-28 23:21:31,314 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 9:14:25, time: 0.917, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0413, loss_cls: 0.1706, acc: 93.6912, loss_bbox: 0.2169, loss_mask: 0.2237, loss: 0.6665 2024-05-28 23:22:19,962 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 9:13:40, time: 0.973, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0428, loss_cls: 0.1751, acc: 93.4036, loss_bbox: 0.2279, loss_mask: 0.2291, loss: 0.6906 2024-05-28 23:23:06,005 - mmdet - INFO - Epoch [8][1100/7330] lr: 1.000e-04, eta: 9:12:52, time: 
0.921, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0413, loss_cls: 0.1734, acc: 93.4946, loss_bbox: 0.2227, loss_mask: 0.2273, loss: 0.6795 2024-05-28 23:23:52,184 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 9:12:05, time: 0.924, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0419, loss_cls: 0.1691, acc: 93.6143, loss_bbox: 0.2193, loss_mask: 0.2255, loss: 0.6717 2024-05-28 23:24:45,236 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 9:11:23, time: 1.061, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0414, loss_cls: 0.1703, acc: 93.6086, loss_bbox: 0.2189, loss_mask: 0.2302, loss: 0.6760 2024-05-28 23:25:34,037 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 9:10:38, time: 0.976, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0420, loss_cls: 0.1688, acc: 93.7207, loss_bbox: 0.2175, loss_mask: 0.2225, loss: 0.6655 2024-05-28 23:26:20,244 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 9:09:51, time: 0.925, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0418, loss_cls: 0.1713, acc: 93.5750, loss_bbox: 0.2175, loss_mask: 0.2209, loss: 0.6663 2024-05-28 23:27:06,010 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 9:09:04, time: 0.915, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0426, loss_cls: 0.1699, acc: 93.5725, loss_bbox: 0.2220, loss_mask: 0.2248, loss: 0.6754 2024-05-28 23:27:52,058 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 9:08:17, time: 0.921, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0441, loss_cls: 0.1734, acc: 93.4756, loss_bbox: 0.2267, loss_mask: 0.2294, loss: 0.6899 2024-05-28 23:28:37,819 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 9:07:29, time: 0.915, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0425, loss_cls: 0.1687, acc: 93.6870, loss_bbox: 0.2206, loss_mask: 0.2229, loss: 0.6703 2024-05-28 23:29:24,048 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 9:06:42, time: 0.925, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0440, loss_cls: 0.1729, acc: 93.5981, loss_bbox: 0.2195, loss_mask: 0.2229, loss: 0.6755 2024-05-28 23:30:10,235 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 9:05:55, time: 0.924, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0409, loss_cls: 0.1661, acc: 93.7683, loss_bbox: 0.2161, loss_mask: 0.2236, loss: 0.6614 2024-05-28 23:30:56,447 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 9:05:08, time: 0.924, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0428, loss_cls: 0.1743, acc: 93.6194, loss_bbox: 0.2211, loss_mask: 0.2265, loss: 0.6805 2024-05-28 23:31:42,625 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 9:04:21, time: 0.924, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0420, loss_cls: 0.1733, acc: 93.4702, loss_bbox: 0.2270, loss_mask: 0.2266, loss: 0.6847 2024-05-28 23:32:28,410 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 9:03:34, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0421, loss_cls: 0.1683, acc: 93.7188, loss_bbox: 0.2185, loss_mask: 0.2212, loss: 0.6657 2024-05-28 23:33:15,445 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 9:02:48, time: 0.941, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0454, loss_cls: 0.1788, acc: 
93.2769, loss_bbox: 0.2333, loss_mask: 0.2262, loss: 0.7000 2024-05-28 23:34:04,123 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 9:02:02, time: 0.973, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0418, loss_cls: 0.1705, acc: 93.6086, loss_bbox: 0.2198, loss_mask: 0.2256, loss: 0.6738 2024-05-28 23:34:50,351 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 9:01:16, time: 0.925, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0425, loss_cls: 0.1751, acc: 93.3938, loss_bbox: 0.2272, loss_mask: 0.2276, loss: 0.6880 2024-05-28 23:35:41,920 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 9:00:32, time: 1.031, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0399, loss_cls: 0.1724, acc: 93.5703, loss_bbox: 0.2231, loss_mask: 0.2277, loss: 0.6782 2024-05-28 23:36:28,187 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 8:59:45, time: 0.925, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0412, loss_cls: 0.1725, acc: 93.5581, loss_bbox: 0.2206, loss_mask: 0.2222, loss: 0.6725 2024-05-28 23:37:14,356 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 8:58:58, time: 0.923, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0422, loss_cls: 0.1745, acc: 93.5076, loss_bbox: 0.2256, loss_mask: 0.2242, loss: 0.6828 2024-05-28 23:37:59,659 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 8:58:11, time: 0.906, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0423, loss_cls: 0.1732, acc: 93.5586, loss_bbox: 0.2224, loss_mask: 0.2236, loss: 0.6784 2024-05-28 23:38:47,518 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 8:57:25, time: 0.957, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0446, loss_cls: 0.1760, acc: 93.5144, loss_bbox: 0.2247, loss_mask: 0.2316, loss: 0.6949 2024-05-28 23:39:33,036 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 8:56:37, time: 0.910, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0395, loss_cls: 0.1725, acc: 93.6836, loss_bbox: 0.2168, loss_mask: 0.2252, loss: 0.6698 2024-05-28 23:40:19,333 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 8:55:50, time: 0.926, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0425, loss_cls: 0.1681, acc: 93.7627, loss_bbox: 0.2193, loss_mask: 0.2225, loss: 0.6680 2024-05-28 23:41:14,716 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 8:55:09, time: 1.108, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0399, loss_cls: 0.1709, acc: 93.5923, loss_bbox: 0.2175, loss_mask: 0.2256, loss: 0.6691 2024-05-28 23:42:00,608 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 8:54:22, time: 0.918, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0398, loss_cls: 0.1673, acc: 93.8315, loss_bbox: 0.2119, loss_mask: 0.2171, loss: 0.6500 2024-05-28 23:42:45,813 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 8:53:35, time: 0.904, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0405, loss_cls: 0.1705, acc: 93.6243, loss_bbox: 0.2192, loss_mask: 0.2214, loss: 0.6671 2024-05-28 23:43:31,345 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 8:52:47, time: 0.911, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0381, loss_cls: 0.1602, acc: 94.0610, loss_bbox: 0.2091, loss_mask: 0.2196, loss: 0.6415 2024-05-28 23:44:16,736 - mmdet - INFO - Epoch 
[8][2450/7330] lr: 1.000e-04, eta: 8:52:00, time: 0.908, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0425, loss_cls: 0.1750, acc: 93.5505, loss_bbox: 0.2255, loss_mask: 0.2319, loss: 0.6908 2024-05-28 23:45:02,311 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 8:51:12, time: 0.911, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0416, loss_cls: 0.1701, acc: 93.6406, loss_bbox: 0.2210, loss_mask: 0.2296, loss: 0.6785 2024-05-28 23:45:48,539 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 8:50:25, time: 0.925, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0434, loss_cls: 0.1707, acc: 93.6414, loss_bbox: 0.2223, loss_mask: 0.2262, loss: 0.6769 2024-05-28 23:46:35,119 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 8:49:39, time: 0.932, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0450, loss_cls: 0.1722, acc: 93.5771, loss_bbox: 0.2262, loss_mask: 0.2323, loss: 0.6926 2024-05-28 23:47:20,867 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 8:48:51, time: 0.915, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0406, loss_cls: 0.1637, acc: 93.8755, loss_bbox: 0.2121, loss_mask: 0.2183, loss: 0.6502 2024-05-28 23:48:06,227 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 8:48:04, time: 0.907, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0403, loss_cls: 0.1614, acc: 93.9758, loss_bbox: 0.2094, loss_mask: 0.2200, loss: 0.6454 2024-05-28 23:48:52,763 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 8:47:17, time: 0.930, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0429, loss_cls: 0.1742, acc: 93.4468, loss_bbox: 0.2224, loss_mask: 0.2239, loss: 0.6784 2024-05-28 23:49:38,225 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 8:46:30, time: 0.910, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0428, loss_cls: 0.1745, acc: 93.5386, loss_bbox: 0.2202, loss_mask: 0.2293, loss: 0.6823 2024-05-28 23:50:26,606 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 8:45:44, time: 0.968, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0421, loss_cls: 0.1739, acc: 93.4727, loss_bbox: 0.2255, loss_mask: 0.2303, loss: 0.6883 2024-05-28 23:51:15,880 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 8:44:59, time: 0.985, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0399, loss_cls: 0.1583, acc: 94.1184, loss_bbox: 0.2065, loss_mask: 0.2206, loss: 0.6399 2024-05-28 23:52:03,919 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 8:44:13, time: 0.961, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0418, loss_cls: 0.1705, acc: 93.6311, loss_bbox: 0.2198, loss_mask: 0.2291, loss: 0.6765 2024-05-28 23:52:49,975 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 8:43:26, time: 0.921, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0429, loss_cls: 0.1829, acc: 93.1968, loss_bbox: 0.2294, loss_mask: 0.2333, loss: 0.7051 2024-05-28 23:53:35,814 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 8:42:39, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0425, loss_cls: 0.1751, acc: 93.4084, loss_bbox: 0.2301, loss_mask: 0.2310, loss: 0.6954 2024-05-28 23:54:22,255 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 8:41:52, time: 0.929, data_time: 0.053, memory: 20925, loss_rpn_cls: 
0.0158, loss_rpn_bbox: 0.0434, loss_cls: 0.1724, acc: 93.5620, loss_bbox: 0.2199, loss_mask: 0.2256, loss: 0.6771 2024-05-28 23:55:10,263 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 8:41:06, time: 0.960, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0404, loss_cls: 0.1693, acc: 93.6533, loss_bbox: 0.2262, loss_mask: 0.2216, loss: 0.6740 2024-05-28 23:55:56,503 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 8:40:20, time: 0.925, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0406, loss_cls: 0.1659, acc: 93.7981, loss_bbox: 0.2147, loss_mask: 0.2244, loss: 0.6611 2024-05-28 23:56:42,335 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 8:39:32, time: 0.917, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0435, loss_cls: 0.1763, acc: 93.4275, loss_bbox: 0.2270, loss_mask: 0.2271, loss: 0.6908 2024-05-28 23:57:36,292 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 8:38:50, time: 1.079, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0420, loss_cls: 0.1747, acc: 93.4319, loss_bbox: 0.2280, loss_mask: 0.2290, loss: 0.6895 2024-05-28 23:58:24,433 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 8:38:04, time: 0.963, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0406, loss_cls: 0.1670, acc: 93.8269, loss_bbox: 0.2142, loss_mask: 0.2223, loss: 0.6597 2024-05-28 23:59:10,834 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 8:37:18, time: 0.928, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0437, loss_cls: 0.1754, acc: 93.4080, loss_bbox: 0.2237, loss_mask: 0.2271, loss: 0.6858 2024-05-28 23:59:56,953 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 8:36:31, time: 0.923, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0458, loss_cls: 0.1780, acc: 93.3540, loss_bbox: 0.2322, loss_mask: 0.2298, loss: 0.7018 2024-05-29 00:00:43,048 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 8:35:43, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0419, loss_cls: 0.1728, acc: 93.6509, loss_bbox: 0.2183, loss_mask: 0.2236, loss: 0.6720 2024-05-29 00:01:29,982 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 8:34:57, time: 0.939, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0453, loss_cls: 0.1821, acc: 93.2212, loss_bbox: 0.2304, loss_mask: 0.2310, loss: 0.7060 2024-05-29 00:02:16,082 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 8:34:10, time: 0.922, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0395, loss_cls: 0.1718, acc: 93.6177, loss_bbox: 0.2174, loss_mask: 0.2233, loss: 0.6671 2024-05-29 00:03:01,795 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 8:33:23, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0429, loss_cls: 0.1727, acc: 93.5837, loss_bbox: 0.2193, loss_mask: 0.2272, loss: 0.6773 2024-05-29 00:03:47,205 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 8:32:35, time: 0.908, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0394, loss_cls: 0.1667, acc: 93.8159, loss_bbox: 0.2145, loss_mask: 0.2237, loss: 0.6590 2024-05-29 00:04:32,336 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 8:31:48, time: 0.903, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0421, loss_cls: 0.1683, acc: 93.7720, loss_bbox: 0.2141, loss_mask: 0.2247, loss: 
0.6640 2024-05-29 00:05:17,703 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 8:31:00, time: 0.907, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0430, loss_cls: 0.1711, acc: 93.6394, loss_bbox: 0.2167, loss_mask: 0.2258, loss: 0.6726 2024-05-29 00:06:03,972 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 8:30:13, time: 0.925, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0408, loss_cls: 0.1738, acc: 93.6116, loss_bbox: 0.2239, loss_mask: 0.2266, loss: 0.6803 2024-05-29 00:06:49,518 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 8:29:26, time: 0.911, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0400, loss_cls: 0.1677, acc: 93.8491, loss_bbox: 0.2161, loss_mask: 0.2255, loss: 0.6657 2024-05-29 00:07:40,333 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 8:28:42, time: 1.016, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0422, loss_cls: 0.1765, acc: 93.3625, loss_bbox: 0.2247, loss_mask: 0.2250, loss: 0.6854 2024-05-29 00:08:28,559 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 8:27:56, time: 0.965, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0390, loss_cls: 0.1624, acc: 94.0215, loss_bbox: 0.2131, loss_mask: 0.2260, loss: 0.6551 2024-05-29 00:09:14,123 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 8:27:09, time: 0.911, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0405, loss_cls: 0.1737, acc: 93.6458, loss_bbox: 0.2175, loss_mask: 0.2272, loss: 0.6739 2024-05-29 00:10:00,280 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 8:26:22, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0441, loss_cls: 0.1863, acc: 93.0947, loss_bbox: 0.2390, loss_mask: 0.2333, loss: 0.7216 2024-05-29 00:10:45,989 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 8:25:35, time: 0.914, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0399, loss_cls: 0.1678, acc: 93.7979, loss_bbox: 0.2124, loss_mask: 0.2206, loss: 0.6566 2024-05-29 00:11:33,464 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 8:24:48, time: 0.949, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0393, loss_cls: 0.1693, acc: 93.7795, loss_bbox: 0.2182, loss_mask: 0.2233, loss: 0.6653 2024-05-29 00:12:19,774 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 8:24:01, time: 0.926, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0423, loss_cls: 0.1732, acc: 93.5552, loss_bbox: 0.2258, loss_mask: 0.2293, loss: 0.6873 2024-05-29 00:13:05,323 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 8:23:14, time: 0.911, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0387, loss_cls: 0.1631, acc: 93.8401, loss_bbox: 0.2139, loss_mask: 0.2237, loss: 0.6546 2024-05-29 00:13:55,960 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 8:22:30, time: 1.012, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0413, loss_cls: 0.1664, acc: 93.7498, loss_bbox: 0.2184, loss_mask: 0.2286, loss: 0.6709 2024-05-29 00:14:46,842 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 8:21:46, time: 1.018, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0421, loss_cls: 0.1814, acc: 93.2939, loss_bbox: 0.2319, loss_mask: 0.2311, loss: 0.7028 2024-05-29 00:15:32,196 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 8:20:58, time: 
0.907, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0409, loss_cls: 0.1767, acc: 93.4504, loss_bbox: 0.2205, loss_mask: 0.2298, loss: 0.6828 2024-05-29 00:16:17,533 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 8:20:11, time: 0.907, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0385, loss_cls: 0.1633, acc: 93.9827, loss_bbox: 0.2063, loss_mask: 0.2175, loss: 0.6402 2024-05-29 00:17:03,182 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 8:19:23, time: 0.913, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0423, loss_cls: 0.1730, acc: 93.4868, loss_bbox: 0.2264, loss_mask: 0.2311, loss: 0.6885 2024-05-29 00:17:49,020 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 8:18:36, time: 0.917, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0419, loss_cls: 0.1757, acc: 93.4055, loss_bbox: 0.2287, loss_mask: 0.2235, loss: 0.6847 2024-05-29 00:18:34,701 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 8:17:49, time: 0.914, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0411, loss_cls: 0.1768, acc: 93.4526, loss_bbox: 0.2231, loss_mask: 0.2278, loss: 0.6850 2024-05-29 00:19:20,273 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 8:17:02, time: 0.911, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0407, loss_cls: 0.1698, acc: 93.6792, loss_bbox: 0.2212, loss_mask: 0.2266, loss: 0.6724 2024-05-29 00:20:05,985 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 8:16:14, time: 0.914, data_time: 0.035, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0415, loss_cls: 0.1722, acc: 93.5518, loss_bbox: 0.2259, loss_mask: 0.2286, loss: 0.6834 2024-05-29 00:20:51,712 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 8:15:27, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0433, loss_cls: 0.1735, acc: 93.4116, loss_bbox: 0.2306, loss_mask: 0.2279, loss: 0.6907 2024-05-29 00:21:37,197 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 8:14:40, time: 0.910, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0414, loss_cls: 0.1678, acc: 93.7344, loss_bbox: 0.2179, loss_mask: 0.2242, loss: 0.6664 2024-05-29 00:22:22,807 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 8:13:53, time: 0.912, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0424, loss_cls: 0.1680, acc: 93.7927, loss_bbox: 0.2161, loss_mask: 0.2264, loss: 0.6692 2024-05-29 00:23:09,014 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 8:13:06, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0428, loss_cls: 0.1772, acc: 93.3081, loss_bbox: 0.2295, loss_mask: 0.2255, loss: 0.6913 2024-05-29 00:23:59,808 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 8:12:21, time: 1.016, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0438, loss_cls: 0.1733, acc: 93.5579, loss_bbox: 0.2242, loss_mask: 0.2317, loss: 0.6902 2024-05-29 00:24:47,682 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 8:11:35, time: 0.958, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0419, loss_cls: 0.1781, acc: 93.3123, loss_bbox: 0.2244, loss_mask: 0.2274, loss: 0.6874 2024-05-29 00:25:33,418 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 8:10:48, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0430, loss_cls: 0.1788, acc: 
93.3179, loss_bbox: 0.2276, loss_mask: 0.2284, loss: 0.6941 2024-05-29 00:26:18,911 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 8:10:01, time: 0.910, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0399, loss_cls: 0.1699, acc: 93.6528, loss_bbox: 0.2208, loss_mask: 0.2249, loss: 0.6708 2024-05-29 00:27:04,353 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 8:09:14, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0413, loss_cls: 0.1686, acc: 93.6785, loss_bbox: 0.2180, loss_mask: 0.2236, loss: 0.6670 2024-05-29 00:27:51,931 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 8:08:27, time: 0.952, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0425, loss_cls: 0.1784, acc: 93.4004, loss_bbox: 0.2296, loss_mask: 0.2253, loss: 0.6925 2024-05-29 00:28:38,455 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 8:07:41, time: 0.930, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0441, loss_cls: 0.1844, acc: 93.0671, loss_bbox: 0.2328, loss_mask: 0.2336, loss: 0.7121 2024-05-29 00:29:23,907 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 8:06:53, time: 0.909, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0405, loss_cls: 0.1716, acc: 93.6235, loss_bbox: 0.2220, loss_mask: 0.2277, loss: 0.6772 2024-05-29 00:30:14,293 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 8:06:09, time: 1.007, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0422, loss_cls: 0.1802, acc: 93.2981, loss_bbox: 0.2288, loss_mask: 0.2335, loss: 0.7009 2024-05-29 00:31:04,787 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 8:05:24, time: 1.010, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0410, loss_cls: 0.1706, acc: 93.7017, loss_bbox: 0.2146, loss_mask: 0.2239, loss: 0.6674 2024-05-29 00:31:50,739 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 8:04:37, time: 0.919, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0412, loss_cls: 0.1709, acc: 93.5984, loss_bbox: 0.2191, loss_mask: 0.2242, loss: 0.6694 2024-05-29 00:32:37,215 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 8:03:50, time: 0.930, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0423, loss_cls: 0.1783, acc: 93.2617, loss_bbox: 0.2286, loss_mask: 0.2310, loss: 0.6961 2024-05-29 00:33:22,995 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 8:03:03, time: 0.916, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0423, loss_cls: 0.1856, acc: 93.1560, loss_bbox: 0.2316, loss_mask: 0.2346, loss: 0.7107 2024-05-29 00:34:08,432 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 8:02:16, time: 0.909, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0405, loss_cls: 0.1733, acc: 93.6272, loss_bbox: 0.2246, loss_mask: 0.2305, loss: 0.6842 2024-05-29 00:34:53,979 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 8:01:28, time: 0.911, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0380, loss_cls: 0.1621, acc: 93.8926, loss_bbox: 0.2102, loss_mask: 0.2224, loss: 0.6461 2024-05-29 00:35:39,816 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 8:00:41, time: 0.917, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0433, loss_cls: 0.1751, acc: 93.4773, loss_bbox: 0.2210, loss_mask: 0.2267, loss: 0.6823 2024-05-29 00:36:25,543 - mmdet - INFO - Epoch 
[8][5800/7330] lr: 1.000e-04, eta: 7:59:54, time: 0.915, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0404, loss_cls: 0.1704, acc: 93.6621, loss_bbox: 0.2204, loss_mask: 0.2269, loss: 0.6731 2024-05-29 00:37:11,851 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 7:59:07, time: 0.926, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0406, loss_cls: 0.1737, acc: 93.5715, loss_bbox: 0.2180, loss_mask: 0.2238, loss: 0.6703 2024-05-29 00:37:57,477 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 7:58:20, time: 0.912, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0435, loss_cls: 0.1750, acc: 93.3826, loss_bbox: 0.2275, loss_mask: 0.2335, loss: 0.6972 2024-05-29 00:38:43,305 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 7:57:33, time: 0.917, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0430, loss_cls: 0.1696, acc: 93.6252, loss_bbox: 0.2203, loss_mask: 0.2237, loss: 0.6716 2024-05-29 00:39:29,314 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 7:56:46, time: 0.920, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0433, loss_cls: 0.1693, acc: 93.7754, loss_bbox: 0.2179, loss_mask: 0.2231, loss: 0.6695 2024-05-29 00:40:19,726 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 7:56:01, time: 1.008, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0451, loss_cls: 0.1761, acc: 93.3657, loss_bbox: 0.2248, loss_mask: 0.2298, loss: 0.6936 2024-05-29 00:41:07,462 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 7:55:15, time: 0.955, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0418, loss_cls: 0.1737, acc: 93.5713, loss_bbox: 0.2226, loss_mask: 0.2313, loss: 0.6855 2024-05-29 00:41:53,113 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 7:54:28, time: 0.913, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0386, loss_cls: 0.1688, acc: 93.6716, loss_bbox: 0.2183, loss_mask: 0.2231, loss: 0.6624 2024-05-29 00:42:39,277 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 7:53:41, time: 0.923, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0445, loss_cls: 0.1773, acc: 93.3665, loss_bbox: 0.2310, loss_mask: 0.2284, loss: 0.6965 2024-05-29 00:43:25,074 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 7:52:54, time: 0.916, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0419, loss_cls: 0.1768, acc: 93.4819, loss_bbox: 0.2201, loss_mask: 0.2190, loss: 0.6743 2024-05-29 00:44:12,800 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 7:52:08, time: 0.955, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0426, loss_cls: 0.1700, acc: 93.6277, loss_bbox: 0.2207, loss_mask: 0.2235, loss: 0.6731 2024-05-29 00:44:58,420 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 7:51:21, time: 0.912, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0419, loss_cls: 0.1731, acc: 93.5833, loss_bbox: 0.2224, loss_mask: 0.2319, loss: 0.6859 2024-05-29 00:45:43,625 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 7:50:33, time: 0.904, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0395, loss_cls: 0.1757, acc: 93.3672, loss_bbox: 0.2306, loss_mask: 0.2298, loss: 0.6909 2024-05-29 00:46:31,871 - mmdet - INFO - Epoch [8][6450/7330] lr: 1.000e-04, eta: 7:49:47, time: 0.965, data_time: 0.061, memory: 20925, loss_rpn_cls: 
0.0169, loss_rpn_bbox: 0.0423, loss_cls: 0.1760, acc: 93.3347, loss_bbox: 0.2300, loss_mask: 0.2336, loss: 0.6987 2024-05-29 00:47:24,109 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 7:49:03, time: 1.045, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0418, loss_cls: 0.1780, acc: 93.3354, loss_bbox: 0.2358, loss_mask: 0.2284, loss: 0.7002 2024-05-29 00:48:09,610 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 7:48:16, time: 0.910, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0421, loss_cls: 0.1790, acc: 93.4417, loss_bbox: 0.2268, loss_mask: 0.2266, loss: 0.6900 2024-05-29 00:48:55,342 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 7:47:29, time: 0.915, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0410, loss_cls: 0.1697, acc: 93.6531, loss_bbox: 0.2203, loss_mask: 0.2228, loss: 0.6694 2024-05-29 00:49:41,232 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 7:46:42, time: 0.918, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0423, loss_cls: 0.1697, acc: 93.7046, loss_bbox: 0.2186, loss_mask: 0.2183, loss: 0.6652 2024-05-29 00:50:26,850 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 7:45:55, time: 0.912, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0426, loss_cls: 0.1695, acc: 93.6812, loss_bbox: 0.2206, loss_mask: 0.2287, loss: 0.6777 2024-05-29 00:51:12,484 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 7:45:08, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0399, loss_cls: 0.1777, acc: 93.4692, loss_bbox: 0.2274, loss_mask: 0.2242, loss: 0.6846 2024-05-29 00:51:58,213 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 7:44:20, time: 0.915, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0448, loss_cls: 0.1776, acc: 93.4280, loss_bbox: 0.2327, loss_mask: 0.2280, loss: 0.6997 2024-05-29 00:52:43,597 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 7:43:33, time: 0.908, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0439, loss_cls: 0.1735, acc: 93.5698, loss_bbox: 0.2291, loss_mask: 0.2294, loss: 0.6923 2024-05-29 00:53:29,346 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 7:42:46, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0413, loss_cls: 0.1699, acc: 93.6914, loss_bbox: 0.2185, loss_mask: 0.2227, loss: 0.6679 2024-05-29 00:54:15,099 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 7:41:59, time: 0.915, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0391, loss_cls: 0.1661, acc: 93.7563, loss_bbox: 0.2136, loss_mask: 0.2304, loss: 0.6627 2024-05-29 00:54:59,888 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 7:41:11, time: 0.896, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0374, loss_cls: 0.1605, acc: 94.1655, loss_bbox: 0.2029, loss_mask: 0.2132, loss: 0.6277 2024-05-29 00:55:45,945 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 7:40:24, time: 0.921, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0416, loss_cls: 0.1737, acc: 93.6099, loss_bbox: 0.2195, loss_mask: 0.2304, loss: 0.6809 2024-05-29 00:56:35,966 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 7:39:39, time: 1.000, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0400, loss_cls: 0.1612, acc: 93.9106, loss_bbox: 0.2121, loss_mask: 0.2214, loss: 
0.6506
2024-05-29 00:57:24,725 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 7:38:54, time: 0.975, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0417, loss_cls: 0.1687, acc: 93.7229, loss_bbox: 0.2175, loss_mask: 0.2247, loss: 0.6678
2024-05-29 00:58:10,609 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 7:38:07, time: 0.918, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0400, loss_cls: 0.1680, acc: 93.6626, loss_bbox: 0.2161, loss_mask: 0.2232, loss: 0.6621
2024-05-29 00:58:56,810 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 7:37:20, time: 0.924, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0430, loss_cls: 0.1729, acc: 93.7427, loss_bbox: 0.2197, loss_mask: 0.2246, loss: 0.6772
2024-05-29 00:59:42,422 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 7:36:32, time: 0.912, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0395, loss_cls: 0.1678, acc: 93.7214, loss_bbox: 0.2165, loss_mask: 0.2285, loss: 0.6682
2024-05-29 01:00:10,384 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-05-29 01:02:24,246 - mmdet - INFO - Evaluating bbox...
2024-05-29 01:02:47,104 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.476
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.711
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.530
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.299
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.520
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.641
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.593
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.402
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.636
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.756
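[Editor's note] In the iteration records above, the logged loss is the sum of the five component terms (loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask); acc is a classification accuracy, not a loss, and is not part of the sum. A minimal sanity check, using values copied from the Epoch [8][7000/7330] entry above:

```python
# Sanity check: the logged total loss equals the sum of its components.
# Values copied verbatim from the Epoch [8][7000/7330] entry in this log.
components = {
    "loss_rpn_cls": 0.0137,
    "loss_rpn_bbox": 0.0374,
    "loss_cls": 0.1605,
    "loss_bbox": 0.2029,
    "loss_mask": 0.2132,
}
total_logged = 0.6277

total_computed = sum(components.values())
assert abs(total_computed - total_logged) < 5e-4  # agrees up to print rounding
print(round(total_computed, 4))  # 0.6277
```

The same decomposition should hold for every iteration entry in this log, up to rounding of the printed values.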
2024-05-29 01:02:47,104 - mmdet - INFO - Evaluating segm...
2024-05-29 01:03:15,174 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.429
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.678
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.460
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.216
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.463
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.641
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.538
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.538
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.538
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.332
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.583
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.722
2024-05-29 01:03:15,561 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 01:03:15,563 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4760, bbox_mAP_50: 0.7110, bbox_mAP_75: 0.5300, bbox_mAP_s: 0.2990, bbox_mAP_m: 0.5200, bbox_mAP_l: 0.6410, bbox_mAP_copypaste: 0.476 0.711 0.530 0.299 0.520 0.641, segm_mAP: 0.4290, segm_mAP_50: 0.6780, segm_mAP_75: 0.4600, segm_mAP_s: 0.2160, segm_mAP_m: 0.4630, segm_mAP_l: 0.6410, segm_mAP_copypaste: 0.429 0.678 0.460 0.216 0.463 0.641
2024-05-29 01:04:07,596 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 7:35:07, time: 1.040, data_time: 0.108, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0379, loss_cls: 0.1607, acc: 93.9927, loss_bbox: 0.2126, loss_mask: 0.2224, loss: 0.6467
2024-05-29 01:04:53,724 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 7:34:20, time: 0.923, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0384, loss_cls: 0.1602, acc: 93.9788, loss_bbox: 0.2118, loss_mask: 0.2156, loss: 0.6385
2024-05-29 01:05:41,803 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 7:33:34, time: 0.962, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0393, loss_cls: 0.1582, acc: 93.9622, loss_bbox: 0.2115, loss_mask: 0.2224, loss: 0.6448
2024-05-29 01:06:32,332 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 7:32:49, time: 1.011, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0393, loss_cls: 0.1591, acc: 93.9797, loss_bbox: 0.2089, loss_mask: 0.2188, loss: 0.6398
2024-05-29 01:07:19,627 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 7:32:03, time: 0.946, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0373, loss_cls: 0.1584, acc: 94.0901, loss_bbox: 0.2089, loss_mask: 0.2187, loss: 0.6372
2024-05-29 01:08:05,518 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 7:31:16, time: 0.918, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0412, loss_cls: 0.1665, acc: 93.6528, loss_bbox: 0.2175, loss_mask: 0.2192, loss: 0.6608
2024-05-29 01:08:51,254 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 7:30:28, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0392, loss_cls: 0.1576, acc: 94.0046, loss_bbox: 0.2092, loss_mask: 0.2157, loss: 0.6358
2024-05-29 01:09:36,872 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 7:29:41, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0387, loss_cls: 0.1602, acc: 93.9324, loss_bbox: 0.2112, loss_mask:
0.2161, loss: 0.6393 2024-05-29 01:10:22,769 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 7:28:54, time: 0.918, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0405, loss_cls: 0.1610, acc: 93.8545, loss_bbox: 0.2119, loss_mask: 0.2147, loss: 0.6404 2024-05-29 01:11:08,881 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 7:28:07, time: 0.922, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0371, loss_cls: 0.1593, acc: 93.8809, loss_bbox: 0.2135, loss_mask: 0.2200, loss: 0.6429 2024-05-29 01:11:54,763 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 7:27:20, time: 0.918, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0357, loss_cls: 0.1542, acc: 94.1089, loss_bbox: 0.2032, loss_mask: 0.2192, loss: 0.6260 2024-05-29 01:12:40,320 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 7:26:33, time: 0.911, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0371, loss_cls: 0.1516, acc: 94.2180, loss_bbox: 0.1985, loss_mask: 0.2081, loss: 0.6081 2024-05-29 01:13:26,245 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 7:25:46, time: 0.918, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0387, loss_cls: 0.1526, acc: 94.1963, loss_bbox: 0.1987, loss_mask: 0.2100, loss: 0.6128 2024-05-29 01:14:12,078 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 7:24:59, time: 0.917, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0385, loss_cls: 0.1535, acc: 94.2239, loss_bbox: 0.2059, loss_mask: 0.2124, loss: 0.6234 2024-05-29 01:14:58,569 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 7:24:12, time: 0.930, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0409, loss_cls: 0.1640, acc: 93.7273, loss_bbox: 0.2184, loss_mask: 0.2178, loss: 0.6553 2024-05-29 01:15:44,844 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 7:23:26, time: 0.925, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0407, loss_cls: 0.1629, acc: 93.8418, loss_bbox: 0.2115, loss_mask: 0.2169, loss: 0.6466 2024-05-29 01:16:30,931 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 7:22:39, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0379, loss_cls: 0.1548, acc: 94.0417, loss_bbox: 0.2082, loss_mask: 0.2148, loss: 0.6290 2024-05-29 01:17:16,714 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 7:21:52, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0387, loss_cls: 0.1588, acc: 93.9370, loss_bbox: 0.2107, loss_mask: 0.2200, loss: 0.6401 2024-05-29 01:18:02,488 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 7:21:05, time: 0.915, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0384, loss_cls: 0.1591, acc: 94.0544, loss_bbox: 0.2048, loss_mask: 0.2142, loss: 0.6298 2024-05-29 01:18:47,791 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 7:20:17, time: 0.906, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0375, loss_cls: 0.1566, acc: 94.0767, loss_bbox: 0.2053, loss_mask: 0.2134, loss: 0.6260 2024-05-29 01:19:33,583 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 7:19:30, time: 0.916, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0399, loss_cls: 0.1551, acc: 94.1292, loss_bbox: 0.2032, loss_mask: 0.2155, loss: 0.6265 2024-05-29 01:20:21,205 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 7:18:44, time: 
0.952, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0387, loss_cls: 0.1542, acc: 94.1853, loss_bbox: 0.2050, loss_mask: 0.2114, loss: 0.6218 2024-05-29 01:21:06,643 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 7:17:57, time: 0.909, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0373, loss_cls: 0.1480, acc: 94.4214, loss_bbox: 0.1984, loss_mask: 0.2157, loss: 0.6118 2024-05-29 01:21:54,998 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 7:17:11, time: 0.967, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0408, loss_cls: 0.1643, acc: 93.7393, loss_bbox: 0.2150, loss_mask: 0.2163, loss: 0.6499 2024-05-29 01:22:45,013 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 7:16:26, time: 1.000, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0387, loss_cls: 0.1549, acc: 94.0942, loss_bbox: 0.2029, loss_mask: 0.2145, loss: 0.6242 2024-05-29 01:23:32,837 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 7:15:40, time: 0.956, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0353, loss_cls: 0.1510, acc: 94.2153, loss_bbox: 0.1986, loss_mask: 0.2134, loss: 0.6116 2024-05-29 01:24:22,198 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 7:14:55, time: 0.987, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0436, loss_cls: 0.1671, acc: 93.6130, loss_bbox: 0.2206, loss_mask: 0.2250, loss: 0.6705 2024-05-29 01:25:08,079 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 7:14:08, time: 0.918, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0394, loss_cls: 0.1609, acc: 93.9043, loss_bbox: 0.2115, loss_mask: 0.2163, loss: 0.6415 2024-05-29 01:25:59,511 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 7:13:23, time: 1.028, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0378, loss_cls: 0.1573, acc: 94.1094, loss_bbox: 0.2047, loss_mask: 0.2148, loss: 0.6278 2024-05-29 01:26:45,609 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 7:12:36, time: 0.922, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0386, loss_cls: 0.1498, acc: 94.3291, loss_bbox: 0.1959, loss_mask: 0.2136, loss: 0.6112 2024-05-29 01:27:31,297 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 7:11:49, time: 0.914, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0350, loss_cls: 0.1484, acc: 94.3496, loss_bbox: 0.1999, loss_mask: 0.2098, loss: 0.6051 2024-05-29 01:28:16,312 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 7:11:02, time: 0.900, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0382, loss_cls: 0.1468, acc: 94.3896, loss_bbox: 0.2013, loss_mask: 0.2175, loss: 0.6155 2024-05-29 01:29:02,028 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 7:10:15, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0394, loss_cls: 0.1610, acc: 93.9590, loss_bbox: 0.2084, loss_mask: 0.2170, loss: 0.6384 2024-05-29 01:29:48,069 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 7:09:28, time: 0.921, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0376, loss_cls: 0.1547, acc: 94.0542, loss_bbox: 0.2060, loss_mask: 0.2089, loss: 0.6199 2024-05-29 01:30:34,410 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 7:08:41, time: 0.927, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0422, loss_cls: 0.1576, acc: 
94.0281, loss_bbox: 0.2055, loss_mask: 0.2162, loss: 0.6351 2024-05-29 01:31:20,497 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 7:07:54, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0373, loss_cls: 0.1501, acc: 94.2620, loss_bbox: 0.1999, loss_mask: 0.2132, loss: 0.6121 2024-05-29 01:32:06,270 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 7:07:07, time: 0.915, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0381, loss_cls: 0.1601, acc: 93.8984, loss_bbox: 0.2099, loss_mask: 0.2168, loss: 0.6371 2024-05-29 01:32:52,049 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 7:06:20, time: 0.916, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0393, loss_cls: 0.1612, acc: 93.8794, loss_bbox: 0.2107, loss_mask: 0.2136, loss: 0.6385 2024-05-29 01:33:38,094 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 7:05:33, time: 0.921, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0391, loss_cls: 0.1604, acc: 93.8865, loss_bbox: 0.2118, loss_mask: 0.2136, loss: 0.6376 2024-05-29 01:34:24,125 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 7:04:46, time: 0.921, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0388, loss_cls: 0.1566, acc: 94.0293, loss_bbox: 0.2094, loss_mask: 0.2194, loss: 0.6378 2024-05-29 01:35:10,140 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 7:03:59, time: 0.920, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0378, loss_cls: 0.1566, acc: 94.0933, loss_bbox: 0.2022, loss_mask: 0.2100, loss: 0.6191 2024-05-29 01:35:56,349 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 7:03:12, time: 0.924, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0403, loss_cls: 0.1598, acc: 93.9067, loss_bbox: 0.2125, loss_mask: 0.2182, loss: 0.6453 2024-05-29 01:36:45,223 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 7:02:27, time: 0.978, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0383, loss_cls: 0.1544, acc: 94.1299, loss_bbox: 0.2072, loss_mask: 0.2177, loss: 0.6293 2024-05-29 01:37:31,209 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 7:01:40, time: 0.920, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0367, loss_cls: 0.1542, acc: 94.0552, loss_bbox: 0.2053, loss_mask: 0.2138, loss: 0.6224 2024-05-29 01:38:19,443 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 7:00:54, time: 0.965, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0395, loss_cls: 0.1596, acc: 93.9287, loss_bbox: 0.2070, loss_mask: 0.2162, loss: 0.6353 2024-05-29 01:39:10,025 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 7:00:09, time: 1.012, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0359, loss_cls: 0.1484, acc: 94.2703, loss_bbox: 0.1974, loss_mask: 0.2102, loss: 0.6039 2024-05-29 01:39:57,782 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 6:59:23, time: 0.955, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0369, loss_cls: 0.1498, acc: 94.2190, loss_bbox: 0.2009, loss_mask: 0.2145, loss: 0.6134 2024-05-29 01:40:47,617 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 6:58:37, time: 0.997, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0408, loss_cls: 0.1618, acc: 93.8704, loss_bbox: 0.2126, loss_mask: 0.2201, loss: 0.6498 2024-05-29 01:41:34,270 - mmdet - INFO - Epoch 
[9][2450/7330] lr: 1.000e-05, eta: 6:57:51, time: 0.933, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0385, loss_cls: 0.1549, acc: 94.1812, loss_bbox: 0.2034, loss_mask: 0.2127, loss: 0.6223 2024-05-29 01:42:25,452 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 6:57:06, time: 1.024, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0386, loss_cls: 0.1533, acc: 94.1453, loss_bbox: 0.2031, loss_mask: 0.2120, loss: 0.6194 2024-05-29 01:43:11,580 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 6:56:19, time: 0.923, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0370, loss_cls: 0.1500, acc: 94.3032, loss_bbox: 0.2029, loss_mask: 0.2147, loss: 0.6167 2024-05-29 01:43:56,860 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 6:55:32, time: 0.906, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0364, loss_cls: 0.1503, acc: 94.2695, loss_bbox: 0.2039, loss_mask: 0.2136, loss: 0.6164 2024-05-29 01:44:43,265 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 6:54:45, time: 0.928, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0404, loss_cls: 0.1584, acc: 93.9363, loss_bbox: 0.2121, loss_mask: 0.2168, loss: 0.6408 2024-05-29 01:45:29,038 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 6:53:58, time: 0.915, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0352, loss_cls: 0.1549, acc: 94.1262, loss_bbox: 0.2040, loss_mask: 0.2160, loss: 0.6233 2024-05-29 01:46:15,468 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 6:53:11, time: 0.929, data_time: 0.068, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0404, loss_cls: 0.1607, acc: 93.9290, loss_bbox: 0.2110, loss_mask: 0.2151, loss: 0.6406 2024-05-29 01:47:01,046 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 6:52:24, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0377, loss_cls: 0.1537, acc: 94.1018, loss_bbox: 0.2070, loss_mask: 0.2148, loss: 0.6261 2024-05-29 01:47:46,466 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 6:51:37, time: 0.908, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0384, loss_cls: 0.1528, acc: 94.1606, loss_bbox: 0.2048, loss_mask: 0.2167, loss: 0.6251 2024-05-29 01:48:32,499 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 6:50:50, time: 0.921, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0390, loss_cls: 0.1509, acc: 94.2388, loss_bbox: 0.1992, loss_mask: 0.2184, loss: 0.6210 2024-05-29 01:49:18,070 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 6:50:03, time: 0.911, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0386, loss_cls: 0.1525, acc: 94.1384, loss_bbox: 0.2002, loss_mask: 0.2143, loss: 0.6183 2024-05-29 01:50:03,970 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 6:49:16, time: 0.918, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0390, loss_cls: 0.1559, acc: 94.0779, loss_bbox: 0.2050, loss_mask: 0.2186, loss: 0.6305 2024-05-29 01:50:49,999 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 6:48:29, time: 0.921, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0390, loss_cls: 0.1569, acc: 94.0591, loss_bbox: 0.2124, loss_mask: 0.2162, loss: 0.6368 2024-05-29 01:51:35,369 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 6:47:42, time: 0.907, data_time: 0.050, memory: 20925, loss_rpn_cls: 
0.0108, loss_rpn_bbox: 0.0350, loss_cls: 0.1475, acc: 94.3879, loss_bbox: 0.1952, loss_mask: 0.2073, loss: 0.5958 2024-05-29 01:52:21,748 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 6:46:55, time: 0.928, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0412, loss_cls: 0.1566, acc: 93.9944, loss_bbox: 0.2111, loss_mask: 0.2144, loss: 0.6371 2024-05-29 01:53:09,741 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 6:46:09, time: 0.960, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0364, loss_cls: 0.1535, acc: 94.1580, loss_bbox: 0.2058, loss_mask: 0.2130, loss: 0.6208 2024-05-29 01:53:55,430 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 6:45:22, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0385, loss_cls: 0.1441, acc: 94.4939, loss_bbox: 0.1920, loss_mask: 0.2059, loss: 0.5923 2024-05-29 01:54:43,591 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 6:44:36, time: 0.963, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0370, loss_cls: 0.1564, acc: 94.0286, loss_bbox: 0.2075, loss_mask: 0.2128, loss: 0.6265 2024-05-29 01:55:34,762 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 6:43:51, time: 1.023, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0364, loss_cls: 0.1448, acc: 94.4409, loss_bbox: 0.1943, loss_mask: 0.2095, loss: 0.5972 2024-05-29 01:56:23,443 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 6:43:06, time: 0.974, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0390, loss_cls: 0.1551, acc: 94.1313, loss_bbox: 0.2028, loss_mask: 0.2165, loss: 0.6274 2024-05-29 01:57:11,654 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 6:42:20, time: 0.964, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0376, loss_cls: 0.1556, acc: 94.1541, loss_bbox: 0.2064, loss_mask: 0.2200, loss: 0.6314 2024-05-29 01:57:57,246 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 6:41:32, time: 0.912, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0355, loss_cls: 0.1497, acc: 94.2620, loss_bbox: 0.1968, loss_mask: 0.2110, loss: 0.6046 2024-05-29 01:58:47,951 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 6:40:47, time: 1.014, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0381, loss_cls: 0.1526, acc: 94.1658, loss_bbox: 0.2036, loss_mask: 0.2110, loss: 0.6171 2024-05-29 01:59:33,597 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 6:40:00, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0379, loss_cls: 0.1493, acc: 94.2896, loss_bbox: 0.1989, loss_mask: 0.2143, loss: 0.6126 2024-05-29 02:00:19,071 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 6:39:13, time: 0.909, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0383, loss_cls: 0.1549, acc: 94.0300, loss_bbox: 0.2073, loss_mask: 0.2199, loss: 0.6328 2024-05-29 02:01:04,800 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 6:38:26, time: 0.915, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0353, loss_cls: 0.1464, acc: 94.3708, loss_bbox: 0.1954, loss_mask: 0.2092, loss: 0.5994 2024-05-29 02:01:50,928 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 6:37:39, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0375, loss_cls: 0.1481, acc: 94.3440, loss_bbox: 0.1956, loss_mask: 0.2083, loss: 
0.6017 2024-05-29 02:02:36,542 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 6:36:52, time: 0.912, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0391, loss_cls: 0.1565, acc: 94.0972, loss_bbox: 0.2029, loss_mask: 0.2134, loss: 0.6244 2024-05-29 02:03:22,499 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 6:36:05, time: 0.919, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0391, loss_cls: 0.1562, acc: 94.0376, loss_bbox: 0.2094, loss_mask: 0.2185, loss: 0.6365 2024-05-29 02:04:08,506 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 6:35:18, time: 0.920, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0383, loss_cls: 0.1612, acc: 93.8035, loss_bbox: 0.2089, loss_mask: 0.2124, loss: 0.6340 2024-05-29 02:04:54,389 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 6:34:31, time: 0.918, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0362, loss_cls: 0.1522, acc: 94.2249, loss_bbox: 0.2051, loss_mask: 0.2119, loss: 0.6203 2024-05-29 02:05:41,512 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 6:33:45, time: 0.942, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0422, loss_cls: 0.1657, acc: 93.7336, loss_bbox: 0.2159, loss_mask: 0.2190, loss: 0.6575 2024-05-29 02:06:27,709 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 6:32:58, time: 0.924, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0353, loss_cls: 0.1475, acc: 94.3918, loss_bbox: 0.1941, loss_mask: 0.2095, loss: 0.5983 2024-05-29 02:07:13,927 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 6:32:11, time: 0.924, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0411, loss_cls: 0.1536, acc: 94.1472, loss_bbox: 0.2088, loss_mask: 0.2183, loss: 0.6345 2024-05-29 02:07:59,617 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 6:31:24, time: 0.914, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0357, loss_cls: 0.1513, acc: 94.1890, loss_bbox: 0.2060, loss_mask: 0.2109, loss: 0.6159 2024-05-29 02:08:45,270 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 6:30:37, time: 0.913, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0361, loss_cls: 0.1473, acc: 94.2947, loss_bbox: 0.2020, loss_mask: 0.2097, loss: 0.6072 2024-05-29 02:09:33,638 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 6:29:51, time: 0.967, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0368, loss_cls: 0.1509, acc: 94.2700, loss_bbox: 0.2009, loss_mask: 0.2107, loss: 0.6121 2024-05-29 02:10:18,882 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 6:29:04, time: 0.905, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0368, loss_cls: 0.1500, acc: 94.2542, loss_bbox: 0.1947, loss_mask: 0.2111, loss: 0.6053 2024-05-29 02:11:06,786 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 6:28:18, time: 0.958, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0364, loss_cls: 0.1499, acc: 94.3496, loss_bbox: 0.2001, loss_mask: 0.2096, loss: 0.6081 2024-05-29 02:11:57,632 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 6:27:33, time: 1.017, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0373, loss_cls: 0.1539, acc: 94.1550, loss_bbox: 0.2050, loss_mask: 0.2080, loss: 0.6172 2024-05-29 02:12:45,320 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 6:26:47, time: 
0.954, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0364, loss_cls: 0.1522, acc: 94.2590, loss_bbox: 0.2070, loss_mask: 0.2144, loss: 0.6218 2024-05-29 02:13:34,972 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 6:26:01, time: 0.993, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0431, loss_cls: 0.1646, acc: 93.6973, loss_bbox: 0.2204, loss_mask: 0.2220, loss: 0.6655 2024-05-29 02:14:21,221 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 6:25:14, time: 0.925, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0407, loss_cls: 0.1663, acc: 93.6936, loss_bbox: 0.2177, loss_mask: 0.2179, loss: 0.6564 2024-05-29 02:15:12,337 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 6:24:29, time: 1.022, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0373, loss_cls: 0.1540, acc: 94.1147, loss_bbox: 0.2045, loss_mask: 0.2185, loss: 0.6266 2024-05-29 02:15:58,901 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 6:23:43, time: 0.931, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0375, loss_cls: 0.1593, acc: 93.9062, loss_bbox: 0.2132, loss_mask: 0.2204, loss: 0.6437 2024-05-29 02:16:44,997 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 6:22:56, time: 0.922, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0358, loss_cls: 0.1502, acc: 94.3821, loss_bbox: 0.1968, loss_mask: 0.2093, loss: 0.6040 2024-05-29 02:17:31,384 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 6:22:09, time: 0.928, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0373, loss_cls: 0.1592, acc: 93.9534, loss_bbox: 0.2079, loss_mask: 0.2165, loss: 0.6346 2024-05-29 02:18:17,322 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 6:21:22, time: 0.919, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0386, loss_cls: 0.1496, acc: 94.2268, loss_bbox: 0.2009, loss_mask: 0.2132, loss: 0.6149 2024-05-29 02:19:03,364 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 6:20:35, time: 0.921, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0377, loss_cls: 0.1523, acc: 94.2383, loss_bbox: 0.2001, loss_mask: 0.2104, loss: 0.6128 2024-05-29 02:19:49,803 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 6:19:48, time: 0.929, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0410, loss_cls: 0.1558, acc: 94.0337, loss_bbox: 0.2073, loss_mask: 0.2177, loss: 0.6365 2024-05-29 02:20:35,512 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 6:19:01, time: 0.914, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0368, loss_cls: 0.1473, acc: 94.3623, loss_bbox: 0.1953, loss_mask: 0.2120, loss: 0.6041 2024-05-29 02:21:21,227 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 6:18:14, time: 0.914, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0380, loss_cls: 0.1511, acc: 94.2048, loss_bbox: 0.1957, loss_mask: 0.2180, loss: 0.6144 2024-05-29 02:22:07,420 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 6:17:28, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0386, loss_cls: 0.1557, acc: 94.0354, loss_bbox: 0.2071, loss_mask: 0.2181, loss: 0.6316 2024-05-29 02:22:53,503 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 6:16:41, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0389, loss_cls: 0.1503, acc: 
94.3247, loss_bbox: 0.2010, loss_mask: 0.2126, loss: 0.6157 2024-05-29 02:23:39,840 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 6:15:54, time: 0.927, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0388, loss_cls: 0.1536, acc: 94.1560, loss_bbox: 0.1976, loss_mask: 0.2132, loss: 0.6169 2024-05-29 02:24:26,025 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 6:15:07, time: 0.924, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0370, loss_cls: 0.1470, acc: 94.3936, loss_bbox: 0.1955, loss_mask: 0.2137, loss: 0.6055 2024-05-29 02:25:12,225 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 6:14:20, time: 0.924, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0387, loss_cls: 0.1532, acc: 94.1548, loss_bbox: 0.2060, loss_mask: 0.2100, loss: 0.6210 2024-05-29 02:26:01,747 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 6:13:35, time: 0.990, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0395, loss_cls: 0.1533, acc: 94.2251, loss_bbox: 0.2032, loss_mask: 0.2179, loss: 0.6275 2024-05-29 02:26:47,797 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 6:12:48, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0383, loss_cls: 0.1532, acc: 94.1655, loss_bbox: 0.2095, loss_mask: 0.2177, loss: 0.6320 2024-05-29 02:27:35,746 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 6:12:02, time: 0.959, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0366, loss_cls: 0.1522, acc: 94.1360, loss_bbox: 0.2038, loss_mask: 0.2179, loss: 0.6227 2024-05-29 02:28:26,103 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 6:11:16, time: 1.007, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0377, loss_cls: 0.1556, acc: 94.0652, loss_bbox: 0.2111, loss_mask: 0.2182, loss: 0.6357 2024-05-29 02:29:14,942 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 6:10:30, time: 0.977, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0365, loss_cls: 0.1512, acc: 94.1926, loss_bbox: 0.2024, loss_mask: 0.2163, loss: 0.6182 2024-05-29 02:30:02,762 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 6:09:44, time: 0.956, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0357, loss_cls: 0.1490, acc: 94.2949, loss_bbox: 0.1994, loss_mask: 0.2143, loss: 0.6096 2024-05-29 02:30:48,423 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 6:08:57, time: 0.913, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0360, loss_cls: 0.1457, acc: 94.4294, loss_bbox: 0.1960, loss_mask: 0.2072, loss: 0.5965 2024-05-29 02:31:39,299 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 6:08:12, time: 1.018, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0380, loss_cls: 0.1521, acc: 94.1638, loss_bbox: 0.2076, loss_mask: 0.2129, loss: 0.6228 2024-05-29 02:32:25,601 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 6:07:25, time: 0.926, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0356, loss_cls: 0.1534, acc: 94.1106, loss_bbox: 0.2023, loss_mask: 0.2143, loss: 0.6190 2024-05-29 02:33:11,377 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 6:06:38, time: 0.916, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0395, loss_cls: 0.1603, acc: 93.9739, loss_bbox: 0.2117, loss_mask: 0.2194, loss: 0.6430 2024-05-29 02:33:57,050 - mmdet - INFO - Epoch 
[9][5800/7330] lr: 1.000e-05, eta: 6:05:51, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0349, loss_cls: 0.1537, acc: 94.1589, loss_bbox: 0.2007, loss_mask: 0.2147, loss: 0.6164 2024-05-29 02:34:42,661 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 6:05:04, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0356, loss_cls: 0.1493, acc: 94.2297, loss_bbox: 0.1985, loss_mask: 0.2123, loss: 0.6085 2024-05-29 02:35:28,112 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 6:04:17, time: 0.909, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0357, loss_cls: 0.1452, acc: 94.4390, loss_bbox: 0.1968, loss_mask: 0.2108, loss: 0.6002 2024-05-29 02:36:14,406 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 6:03:30, time: 0.926, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0380, loss_cls: 0.1494, acc: 94.2959, loss_bbox: 0.1985, loss_mask: 0.2151, loss: 0.6129 2024-05-29 02:36:59,824 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 6:02:43, time: 0.908, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0385, loss_cls: 0.1537, acc: 94.1230, loss_bbox: 0.2030, loss_mask: 0.2131, loss: 0.6211 2024-05-29 02:37:45,253 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 6:01:56, time: 0.909, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0391, loss_cls: 0.1506, acc: 94.2888, loss_bbox: 0.2004, loss_mask: 0.2103, loss: 0.6130 2024-05-29 02:38:30,630 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 6:01:09, time: 0.908, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0369, loss_cls: 0.1515, acc: 94.2659, loss_bbox: 0.2056, loss_mask: 0.2146, loss: 0.6205 2024-05-29 02:39:16,866 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 6:00:22, time: 0.925, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0369, loss_cls: 0.1538, acc: 94.1550, loss_bbox: 0.2036, loss_mask: 0.2163, loss: 0.6233 2024-05-29 02:40:02,623 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 5:59:35, time: 0.915, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0372, loss_cls: 0.1519, acc: 94.2134, loss_bbox: 0.2017, loss_mask: 0.2156, loss: 0.6188 2024-05-29 02:40:48,287 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 5:58:48, time: 0.913, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0370, loss_cls: 0.1504, acc: 94.2695, loss_bbox: 0.2022, loss_mask: 0.2063, loss: 0.6085 2024-05-29 02:41:34,416 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 5:58:01, time: 0.923, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0365, loss_cls: 0.1561, acc: 94.1025, loss_bbox: 0.2075, loss_mask: 0.2127, loss: 0.6251 2024-05-29 02:42:23,389 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 5:57:15, time: 0.979, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0422, loss_cls: 0.1587, acc: 93.9529, loss_bbox: 0.2114, loss_mask: 0.2149, loss: 0.6405 2024-05-29 02:43:09,630 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 5:56:29, time: 0.925, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0391, loss_cls: 0.1435, acc: 94.5151, loss_bbox: 0.1925, loss_mask: 0.2067, loss: 0.5940 2024-05-29 02:43:57,858 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 5:55:43, time: 0.964, data_time: 0.054, memory: 20925, loss_rpn_cls: 
0.0129, loss_rpn_bbox: 0.0372, loss_cls: 0.1497, acc: 94.2166, loss_bbox: 0.2016, loss_mask: 0.2081, loss: 0.6096 2024-05-29 02:44:48,299 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 5:54:57, time: 1.009, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0394, loss_cls: 0.1521, acc: 94.1458, loss_bbox: 0.2040, loss_mask: 0.2135, loss: 0.6208 2024-05-29 02:45:36,515 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 5:54:11, time: 0.964, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0357, loss_cls: 0.1496, acc: 94.2620, loss_bbox: 0.2030, loss_mask: 0.2132, loss: 0.6132 2024-05-29 02:46:25,144 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 5:53:25, time: 0.973, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0412, loss_cls: 0.1542, acc: 94.0366, loss_bbox: 0.2070, loss_mask: 0.2138, loss: 0.6296 2024-05-29 02:47:11,280 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 5:52:38, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0373, loss_cls: 0.1541, acc: 94.1514, loss_bbox: 0.2039, loss_mask: 0.2168, loss: 0.6245 2024-05-29 02:48:01,985 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 5:51:53, time: 1.014, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0368, loss_cls: 0.1487, acc: 94.3372, loss_bbox: 0.2002, loss_mask: 0.2111, loss: 0.6091 2024-05-29 02:48:48,125 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 5:51:06, time: 0.923, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0376, loss_cls: 0.1521, acc: 94.2019, loss_bbox: 0.2020, loss_mask: 0.2102, loss: 0.6147 2024-05-29 02:49:34,346 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 5:50:19, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0391, loss_cls: 0.1503, acc: 94.3342, loss_bbox: 0.1987, loss_mask: 0.2093, loss: 0.6095 2024-05-29 02:50:19,503 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 5:49:32, time: 0.903, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0363, loss_cls: 0.1426, acc: 94.5415, loss_bbox: 0.1936, loss_mask: 0.2088, loss: 0.5931 2024-05-29 02:51:05,409 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 5:48:45, time: 0.918, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0365, loss_cls: 0.1464, acc: 94.4553, loss_bbox: 0.1947, loss_mask: 0.2142, loss: 0.6038 2024-05-29 02:51:51,178 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 5:47:58, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0398, loss_cls: 0.1526, acc: 94.1531, loss_bbox: 0.2008, loss_mask: 0.2162, loss: 0.6222 2024-05-29 02:52:36,711 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 5:47:11, time: 0.911, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0368, loss_cls: 0.1497, acc: 94.3452, loss_bbox: 0.2023, loss_mask: 0.2139, loss: 0.6145 2024-05-29 02:53:22,248 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 5:46:24, time: 0.911, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0376, loss_cls: 0.1507, acc: 94.2192, loss_bbox: 0.2040, loss_mask: 0.2134, loss: 0.6181 2024-05-29 02:54:08,623 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 5:45:37, time: 0.927, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0407, loss_cls: 0.1510, acc: 94.2131, loss_bbox: 0.2046, loss_mask: 0.2135, loss: 
0.6223
2024-05-29 02:54:54,127 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 5:44:50, time: 0.910, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0356, loss_cls: 0.1494, acc: 94.3210, loss_bbox: 0.2017, loss_mask: 0.2123, loss: 0.6105
2024-05-29 02:55:39,752 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 5:44:03, time: 0.912, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0352, loss_cls: 0.1453, acc: 94.4216, loss_bbox: 0.2007, loss_mask: 0.2088, loss: 0.6018
2024-05-29 02:56:25,244 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 5:43:16, time: 0.910, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0356, loss_cls: 0.1501, acc: 94.3113, loss_bbox: 0.1981, loss_mask: 0.2101, loss: 0.6046
2024-05-29 02:57:10,955 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 5:42:29, time: 0.914, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0399, loss_cls: 0.1581, acc: 93.9646, loss_bbox: 0.2111, loss_mask: 0.2151, loss: 0.6373
2024-05-29 02:57:38,965 - mmdet - INFO - Saving checkpoint at 9 epochs
2024-05-29 02:59:48,362 - mmdet - INFO - Evaluating bbox...
2024-05-29 03:00:15,762 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.496
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.723
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.544
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.314
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.537
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.665
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.612
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.612
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.612
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.423
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.653
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.772
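[Editor's note] Each validation pass ends with an Epoch(val) record that repeats the headline bbox/segm mAP values on a single line (see the epoch-8 summary above; the epoch-9 one follows below). A minimal sketch for pulling those numbers out of a log like this one; the filename train.log and the helper script are illustrative, not part of this run:

```python
# Minimal sketch: collect per-epoch bbox/segm mAP from Epoch(val) lines.
# "train.log" is an assumed path; point it at wherever this log is stored.
import re

PATTERN = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\]\s+bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)"
)

results = {}
with open("train.log") as f:
    for line in f:
        m = PATTERN.search(line)
        if m:
            epoch = int(m.group(1))
            results[epoch] = (float(m.group(2)), float(m.group(3)))

for epoch, (bbox_map, segm_map) in sorted(results.items()):
    print(f"epoch {epoch}: bbox_mAP={bbox_map:.3f}  segm_mAP={segm_map:.3f}")
# From the summaries in this section: epoch 8 -> 0.476 / 0.429, epoch 9 -> 0.496 / 0.442
```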
2024-05-29 03:00:39,442 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.442
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.690
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.474
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.226
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.477
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.657
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.349
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.593
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.729
2024-05-29 03:00:39,812 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 03:00:39,814 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4960, bbox_mAP_50: 0.7230, bbox_mAP_75: 0.5440, bbox_mAP_s: 0.3140, bbox_mAP_m: 0.5370, bbox_mAP_l: 0.6650, bbox_mAP_copypaste: 0.496 0.723 0.544 0.314 0.537 0.665, segm_mAP: 0.4420, segm_mAP_50: 0.6900, segm_mAP_75: 0.4740, segm_mAP_s: 0.2260, segm_mAP_m: 0.4770, segm_mAP_l: 0.6570, segm_mAP_copypaste: 0.442 0.690 0.474 0.226 0.477 0.657
2024-05-29 03:01:29,260 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 5:41:06, time: 0.989, data_time: 0.122, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0351, loss_cls: 0.1398, acc: 94.6145, loss_bbox: 0.1903, loss_mask: 0.2073, loss: 0.5844
2024-05-29 03:02:18,403 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 5:40:20, time: 0.983, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0391, loss_cls: 0.1521, acc: 94.1223, loss_bbox: 0.2047, loss_mask: 0.2083, loss: 0.6164
2024-05-29 03:03:05,743 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 5:39:34, time: 0.947, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0395, loss_cls: 0.1571, acc: 93.9758, loss_bbox: 0.2094, loss_mask: 0.2122, loss: 0.6300
2024-05-29 03:03:52,003 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 5:38:47, time: 0.925, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0370, loss_cls: 0.1450, acc: 94.4763, loss_bbox: 0.1932, loss_mask: 0.2082, loss: 0.5949
2024-05-29 03:04:38,896 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 5:38:01, time: 0.938, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0399, loss_cls: 0.1494, acc: 94.2700, loss_bbox: 0.2004, loss_mask: 0.2160, loss: 0.6184
2024-05-29 03:05:27,604 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 5:37:15, time: 0.974, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0380, loss_cls: 0.1500, acc: 94.2842, loss_bbox: 0.2020, loss_mask: 0.2147, loss: 0.6166
2024-05-29 03:06:13,874 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 5:36:28, time: 0.926, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0361, loss_cls: 0.1445, acc: 94.3953, loss_bbox: 0.1957, loss_mask: 0.2142, loss: 0.6034
2024-05-29 03:07:02,247 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 5:35:42, time: 0.967, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0361, loss_cls: 0.1476, acc: 94.3735, loss_bbox: 0.1962,
loss_mask: 0.2133, loss: 0.6045 2024-05-29 03:07:50,142 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 5:34:56, time: 0.958, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0380, loss_cls: 0.1482, acc: 94.3579, loss_bbox: 0.1977, loss_mask: 0.2153, loss: 0.6106 2024-05-29 03:08:36,863 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 5:34:09, time: 0.934, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0377, loss_cls: 0.1509, acc: 94.2461, loss_bbox: 0.1993, loss_mask: 0.2136, loss: 0.6131 2024-05-29 03:09:25,671 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 5:33:23, time: 0.976, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0378, loss_cls: 0.1558, acc: 93.9949, loss_bbox: 0.2039, loss_mask: 0.2123, loss: 0.6216 2024-05-29 03:10:11,541 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 5:32:36, time: 0.917, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0350, loss_cls: 0.1449, acc: 94.4365, loss_bbox: 0.1996, loss_mask: 0.2117, loss: 0.6019 2024-05-29 03:10:57,863 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 5:31:49, time: 0.926, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0389, loss_cls: 0.1498, acc: 94.2241, loss_bbox: 0.2012, loss_mask: 0.2138, loss: 0.6157 2024-05-29 03:11:44,456 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 5:31:03, time: 0.931, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0384, loss_cls: 0.1467, acc: 94.3696, loss_bbox: 0.1961, loss_mask: 0.2163, loss: 0.6095 2024-05-29 03:12:33,664 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 5:30:17, time: 0.984, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0367, loss_cls: 0.1528, acc: 94.1416, loss_bbox: 0.2039, loss_mask: 0.2087, loss: 0.6133 2024-05-29 03:13:20,249 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 5:29:30, time: 0.932, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0390, loss_cls: 0.1539, acc: 94.1584, loss_bbox: 0.2028, loss_mask: 0.2094, loss: 0.6175 2024-05-29 03:14:08,710 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 5:28:44, time: 0.969, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0395, loss_cls: 0.1556, acc: 94.0210, loss_bbox: 0.2086, loss_mask: 0.2162, loss: 0.6324 2024-05-29 03:14:54,444 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 5:27:57, time: 0.915, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0361, loss_cls: 0.1487, acc: 94.4128, loss_bbox: 0.1978, loss_mask: 0.2124, loss: 0.6062 2024-05-29 03:15:43,652 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 5:27:11, time: 0.984, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0374, loss_cls: 0.1551, acc: 94.0391, loss_bbox: 0.2054, loss_mask: 0.2144, loss: 0.6256 2024-05-29 03:16:29,995 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 5:26:25, time: 0.927, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0363, loss_cls: 0.1505, acc: 94.2554, loss_bbox: 0.1995, loss_mask: 0.2099, loss: 0.6081 2024-05-29 03:17:16,077 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 5:25:38, time: 0.922, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0394, loss_cls: 0.1527, acc: 94.1931, loss_bbox: 0.2069, loss_mask: 0.2136, loss: 0.6250 2024-05-29 03:18:02,598 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 5:24:51, time: 0.930, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0364, loss_cls: 0.1413, acc: 94.5347, loss_bbox: 0.1948, loss_mask: 0.2122, loss: 0.5964 2024-05-29 03:18:50,507 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 5:24:05, time: 0.958, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0363, loss_cls: 0.1443, acc: 94.5056, loss_bbox: 0.1963, loss_mask: 0.2108, loss: 0.5992 2024-05-29 03:19:36,304 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 5:23:18, time: 0.916, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0349, loss_cls: 0.1429, acc: 94.5103, loss_bbox: 0.1932, loss_mask: 0.2083, loss: 0.5902 2024-05-29 03:20:22,657 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 5:22:31, time: 0.927, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0366, loss_cls: 0.1536, acc: 94.1313, loss_bbox: 0.2046, loss_mask: 0.2117, loss: 0.6173 2024-05-29 03:21:09,338 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 5:21:44, time: 0.934, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0371, loss_cls: 0.1505, acc: 94.2356, loss_bbox: 0.2036, loss_mask: 0.2074, loss: 0.6108 2024-05-29 03:21:57,192 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 5:20:58, time: 0.957, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0351, loss_cls: 0.1454, acc: 94.5117, loss_bbox: 0.1960, loss_mask: 0.2081, loss: 0.5959 2024-05-29 03:22:43,023 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 5:20:11, time: 0.917, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0362, loss_cls: 0.1448, acc: 94.4199, loss_bbox: 0.2007, loss_mask: 0.2141, loss: 0.6070 2024-05-29 03:23:31,177 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 5:19:25, time: 0.963, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0358, loss_cls: 0.1510, acc: 94.1829, loss_bbox: 0.2026, loss_mask: 0.2143, loss: 0.6156 2024-05-29 03:24:19,824 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 5:18:39, time: 0.973, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0383, loss_cls: 0.1473, acc: 94.3186, loss_bbox: 0.2018, loss_mask: 0.2124, loss: 0.6117 2024-05-29 03:25:05,832 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 5:17:52, time: 0.920, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0346, loss_cls: 0.1388, acc: 94.6118, loss_bbox: 0.1902, loss_mask: 0.2074, loss: 0.5821 2024-05-29 03:25:54,360 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 5:17:06, time: 0.971, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0360, loss_cls: 0.1494, acc: 94.2583, loss_bbox: 0.2015, loss_mask: 0.2158, loss: 0.6154 2024-05-29 03:26:40,561 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 5:16:19, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0368, loss_cls: 0.1474, acc: 94.3506, loss_bbox: 0.1974, loss_mask: 0.2073, loss: 0.6001 2024-05-29 03:27:26,086 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 5:15:32, time: 0.911, data_time: 0.033, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0365, loss_cls: 0.1512, acc: 94.1960, loss_bbox: 0.2004, loss_mask: 0.2095, loss: 0.6095 2024-05-29 03:28:14,588 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 5:14:46, time: 0.970, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0109, 
loss_rpn_bbox: 0.0341, loss_cls: 0.1394, acc: 94.5938, loss_bbox: 0.1914, loss_mask: 0.2076, loss: 0.5834 2024-05-29 03:29:01,345 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 5:14:00, time: 0.935, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0372, loss_cls: 0.1527, acc: 94.2231, loss_bbox: 0.2014, loss_mask: 0.2084, loss: 0.6116 2024-05-29 03:29:46,882 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 5:13:13, time: 0.911, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0354, loss_cls: 0.1521, acc: 94.2046, loss_bbox: 0.1983, loss_mask: 0.2125, loss: 0.6115 2024-05-29 03:30:36,418 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 5:12:27, time: 0.991, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0405, loss_cls: 0.1521, acc: 94.0735, loss_bbox: 0.2092, loss_mask: 0.2185, loss: 0.6336 2024-05-29 03:31:22,285 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 5:11:40, time: 0.917, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0369, loss_cls: 0.1503, acc: 94.1909, loss_bbox: 0.2020, loss_mask: 0.2110, loss: 0.6125 2024-05-29 03:32:12,773 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 5:10:54, time: 1.009, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0374, loss_cls: 0.1534, acc: 94.1433, loss_bbox: 0.1996, loss_mask: 0.2145, loss: 0.6170 2024-05-29 03:32:58,343 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 5:10:07, time: 0.912, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0375, loss_cls: 0.1478, acc: 94.3640, loss_bbox: 0.1975, loss_mask: 0.2137, loss: 0.6087 2024-05-29 03:33:44,622 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 5:09:21, time: 0.926, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0377, loss_cls: 0.1558, acc: 94.0540, loss_bbox: 0.2043, loss_mask: 0.2116, loss: 0.6216 2024-05-29 03:34:33,103 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 5:08:34, time: 0.970, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0382, loss_cls: 0.1470, acc: 94.3406, loss_bbox: 0.1998, loss_mask: 0.2091, loss: 0.6065 2024-05-29 03:35:19,561 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 5:07:48, time: 0.929, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0361, loss_cls: 0.1521, acc: 94.1416, loss_bbox: 0.2034, loss_mask: 0.2128, loss: 0.6160 2024-05-29 03:36:05,468 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 5:07:01, time: 0.918, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0382, loss_cls: 0.1521, acc: 94.1807, loss_bbox: 0.2023, loss_mask: 0.2213, loss: 0.6261 2024-05-29 03:36:51,440 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 5:06:14, time: 0.919, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0381, loss_cls: 0.1453, acc: 94.5356, loss_bbox: 0.1985, loss_mask: 0.2115, loss: 0.6048 2024-05-29 03:37:37,671 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 5:05:27, time: 0.925, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0382, loss_cls: 0.1568, acc: 93.9436, loss_bbox: 0.2107, loss_mask: 0.2168, loss: 0.6350 2024-05-29 03:38:26,946 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 5:04:41, time: 0.986, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0400, loss_cls: 0.1559, acc: 93.9473, loss_bbox: 0.2081, loss_mask: 0.2166, 
loss: 0.6333 2024-05-29 03:39:13,429 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 5:03:55, time: 0.930, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0401, loss_cls: 0.1583, acc: 94.0110, loss_bbox: 0.2121, loss_mask: 0.2143, loss: 0.6371 2024-05-29 03:40:01,722 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 5:03:08, time: 0.966, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0374, loss_cls: 0.1485, acc: 94.2725, loss_bbox: 0.2012, loss_mask: 0.2167, loss: 0.6151 2024-05-29 03:40:50,382 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 5:02:22, time: 0.973, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0391, loss_cls: 0.1476, acc: 94.4048, loss_bbox: 0.2013, loss_mask: 0.2123, loss: 0.6113 2024-05-29 03:41:36,682 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 5:01:36, time: 0.926, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0387, loss_cls: 0.1494, acc: 94.2935, loss_bbox: 0.2011, loss_mask: 0.2089, loss: 0.6100 2024-05-29 03:42:25,901 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 5:00:50, time: 0.984, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0357, loss_cls: 0.1418, acc: 94.4592, loss_bbox: 0.1960, loss_mask: 0.2105, loss: 0.5952 2024-05-29 03:43:11,880 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 5:00:03, time: 0.920, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0355, loss_cls: 0.1428, acc: 94.5823, loss_bbox: 0.1936, loss_mask: 0.2057, loss: 0.5888 2024-05-29 03:43:57,892 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 4:59:16, time: 0.920, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0411, loss_cls: 0.1508, acc: 94.2632, loss_bbox: 0.2001, loss_mask: 0.2110, loss: 0.6154 2024-05-29 03:44:46,518 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 4:58:30, time: 0.973, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0366, loss_cls: 0.1501, acc: 94.2422, loss_bbox: 0.2025, loss_mask: 0.2095, loss: 0.6106 2024-05-29 03:45:32,662 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 4:57:43, time: 0.923, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0378, loss_cls: 0.1449, acc: 94.3728, loss_bbox: 0.1983, loss_mask: 0.2126, loss: 0.6048 2024-05-29 03:46:19,478 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 4:56:56, time: 0.936, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0396, loss_cls: 0.1524, acc: 94.1340, loss_bbox: 0.2036, loss_mask: 0.2134, loss: 0.6216 2024-05-29 03:47:07,932 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 4:56:10, time: 0.969, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0390, loss_cls: 0.1536, acc: 94.1611, loss_bbox: 0.2006, loss_mask: 0.2107, loss: 0.6165 2024-05-29 03:47:56,614 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 4:55:24, time: 0.974, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0370, loss_cls: 0.1499, acc: 94.1541, loss_bbox: 0.1993, loss_mask: 0.2072, loss: 0.6050 2024-05-29 03:48:42,043 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 4:54:37, time: 0.909, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0364, loss_cls: 0.1454, acc: 94.3804, loss_bbox: 0.1966, loss_mask: 0.2183, loss: 0.6077 2024-05-29 03:49:27,935 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
4:53:50, time: 0.918, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0352, loss_cls: 0.1480, acc: 94.4155, loss_bbox: 0.1987, loss_mask: 0.2073, loss: 0.6014 2024-05-29 03:50:13,758 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 4:53:03, time: 0.916, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0357, loss_cls: 0.1485, acc: 94.2090, loss_bbox: 0.2023, loss_mask: 0.2168, loss: 0.6151 2024-05-29 03:51:02,041 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 4:52:17, time: 0.966, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0388, loss_cls: 0.1556, acc: 93.9895, loss_bbox: 0.2071, loss_mask: 0.2080, loss: 0.6208 2024-05-29 03:51:47,403 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 4:51:30, time: 0.907, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0361, loss_cls: 0.1446, acc: 94.4575, loss_bbox: 0.1922, loss_mask: 0.2135, loss: 0.5969 2024-05-29 03:52:33,309 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 4:50:43, time: 0.918, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0374, loss_cls: 0.1510, acc: 94.2170, loss_bbox: 0.2021, loss_mask: 0.2151, loss: 0.6179 2024-05-29 03:53:18,743 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 4:49:56, time: 0.909, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0356, loss_cls: 0.1412, acc: 94.6006, loss_bbox: 0.1908, loss_mask: 0.2097, loss: 0.5886 2024-05-29 03:54:05,031 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 4:49:09, time: 0.926, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0368, loss_cls: 0.1457, acc: 94.3713, loss_bbox: 0.1931, loss_mask: 0.2087, loss: 0.5965 2024-05-29 03:54:53,332 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 4:48:23, time: 0.966, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0376, loss_cls: 0.1440, acc: 94.4517, loss_bbox: 0.1941, loss_mask: 0.2086, loss: 0.5964 2024-05-29 03:55:39,443 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 4:47:36, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0369, loss_cls: 0.1445, acc: 94.4631, loss_bbox: 0.1927, loss_mask: 0.2071, loss: 0.5934 2024-05-29 03:56:27,607 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 4:46:50, time: 0.963, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0386, loss_cls: 0.1548, acc: 94.0347, loss_bbox: 0.2086, loss_mask: 0.2130, loss: 0.6268 2024-05-29 03:57:15,667 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 4:46:04, time: 0.961, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0371, loss_cls: 0.1516, acc: 94.1616, loss_bbox: 0.1970, loss_mask: 0.2147, loss: 0.6134 2024-05-29 03:58:01,534 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 4:45:17, time: 0.917, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0367, loss_cls: 0.1493, acc: 94.2983, loss_bbox: 0.1983, loss_mask: 0.2116, loss: 0.6091 2024-05-29 03:58:50,364 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 4:44:31, time: 0.977, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0357, loss_cls: 0.1425, acc: 94.5498, loss_bbox: 0.1945, loss_mask: 0.2068, loss: 0.5904 2024-05-29 03:59:37,042 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 4:43:44, time: 0.934, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 
0.0379, loss_cls: 0.1528, acc: 94.1865, loss_bbox: 0.2025, loss_mask: 0.2129, loss: 0.6188 2024-05-29 04:00:22,981 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 4:42:57, time: 0.919, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0390, loss_cls: 0.1539, acc: 94.1963, loss_bbox: 0.2035, loss_mask: 0.2177, loss: 0.6266 2024-05-29 04:01:10,669 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 4:42:11, time: 0.954, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0348, loss_cls: 0.1447, acc: 94.5085, loss_bbox: 0.1966, loss_mask: 0.2055, loss: 0.5931 2024-05-29 04:01:56,694 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 4:41:24, time: 0.921, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0379, loss_cls: 0.1525, acc: 94.1255, loss_bbox: 0.2050, loss_mask: 0.2115, loss: 0.6192 2024-05-29 04:02:43,169 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 4:40:37, time: 0.930, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0362, loss_cls: 0.1506, acc: 94.2307, loss_bbox: 0.1999, loss_mask: 0.2120, loss: 0.6115 2024-05-29 04:03:31,298 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 4:39:51, time: 0.962, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0378, loss_cls: 0.1472, acc: 94.3901, loss_bbox: 0.2013, loss_mask: 0.2142, loss: 0.6125 2024-05-29 04:04:20,279 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 4:39:05, time: 0.980, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0382, loss_cls: 0.1512, acc: 94.1772, loss_bbox: 0.2034, loss_mask: 0.2125, loss: 0.6170 2024-05-29 04:05:06,759 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 4:38:18, time: 0.930, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0380, loss_cls: 0.1490, acc: 94.2410, loss_bbox: 0.2030, loss_mask: 0.2112, loss: 0.6127 2024-05-29 04:05:53,431 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 4:37:32, time: 0.933, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0376, loss_cls: 0.1531, acc: 94.0754, loss_bbox: 0.2042, loss_mask: 0.2118, loss: 0.6190 2024-05-29 04:06:39,768 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 4:36:45, time: 0.927, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0382, loss_cls: 0.1529, acc: 94.1187, loss_bbox: 0.2079, loss_mask: 0.2141, loss: 0.6263 2024-05-29 04:07:28,207 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 4:35:59, time: 0.969, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0379, loss_cls: 0.1487, acc: 94.3140, loss_bbox: 0.1959, loss_mask: 0.2078, loss: 0.6018 2024-05-29 04:08:14,999 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 4:35:12, time: 0.936, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0376, loss_cls: 0.1523, acc: 94.2024, loss_bbox: 0.2008, loss_mask: 0.2095, loss: 0.6126 2024-05-29 04:09:01,556 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 4:34:25, time: 0.931, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0366, loss_cls: 0.1464, acc: 94.3621, loss_bbox: 0.2002, loss_mask: 0.2182, loss: 0.6128 2024-05-29 04:09:47,965 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 4:33:38, time: 0.928, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0400, loss_cls: 0.1526, acc: 94.1506, loss_bbox: 0.2054, loss_mask: 0.2169, loss: 0.6286 
2024-05-29 04:10:34,593 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 4:32:52, time: 0.933, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0390, loss_cls: 0.1544, acc: 94.0642, loss_bbox: 0.2108, loss_mask: 0.2173, loss: 0.6347 2024-05-29 04:11:23,066 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 4:32:06, time: 0.969, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0361, loss_cls: 0.1446, acc: 94.4006, loss_bbox: 0.2010, loss_mask: 0.2169, loss: 0.6106 2024-05-29 04:12:09,033 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 4:31:19, time: 0.919, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0366, loss_cls: 0.1444, acc: 94.4702, loss_bbox: 0.1942, loss_mask: 0.2142, loss: 0.6023 2024-05-29 04:13:01,591 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 4:30:33, time: 1.007, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0385, loss_cls: 0.1504, acc: 94.2041, loss_bbox: 0.2034, loss_mask: 0.2128, loss: 0.6181 2024-05-29 04:13:47,794 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 4:29:47, time: 0.968, data_time: 0.088, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0359, loss_cls: 0.1406, acc: 94.5696, loss_bbox: 0.1887, loss_mask: 0.2101, loss: 0.5860 2024-05-29 04:14:35,092 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 4:29:00, time: 0.946, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0363, loss_cls: 0.1453, acc: 94.4370, loss_bbox: 0.1945, loss_mask: 0.2085, loss: 0.5961 2024-05-29 04:15:23,426 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 4:28:14, time: 0.967, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0376, loss_cls: 0.1533, acc: 94.0813, loss_bbox: 0.2056, loss_mask: 0.2128, loss: 0.6215 2024-05-29 04:16:09,038 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 4:27:27, time: 0.912, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0370, loss_cls: 0.1464, acc: 94.4761, loss_bbox: 0.2003, loss_mask: 0.2144, loss: 0.6101 2024-05-29 04:16:55,236 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 4:26:40, time: 0.924, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0383, loss_cls: 0.1499, acc: 94.2449, loss_bbox: 0.2033, loss_mask: 0.2107, loss: 0.6147 2024-05-29 04:17:44,249 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 4:25:54, time: 0.980, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0368, loss_cls: 0.1484, acc: 94.2991, loss_bbox: 0.2017, loss_mask: 0.2170, loss: 0.6167 2024-05-29 04:18:30,953 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 4:25:07, time: 0.934, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0378, loss_cls: 0.1488, acc: 94.2888, loss_bbox: 0.2018, loss_mask: 0.2110, loss: 0.6112 2024-05-29 04:19:19,599 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 4:24:21, time: 0.973, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0377, loss_cls: 0.1452, acc: 94.4832, loss_bbox: 0.1920, loss_mask: 0.2066, loss: 0.5934 2024-05-29 04:20:06,275 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 4:23:35, time: 0.934, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0365, loss_cls: 0.1453, acc: 94.4331, loss_bbox: 0.1958, loss_mask: 0.2100, loss: 0.6009 2024-05-29 04:20:55,025 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 4:22:48, 
time: 0.975, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0371, loss_cls: 0.1502, acc: 94.3091, loss_bbox: 0.1990, loss_mask: 0.2138, loss: 0.6131 2024-05-29 04:21:41,787 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 4:22:02, time: 0.935, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0365, loss_cls: 0.1461, acc: 94.4182, loss_bbox: 0.1980, loss_mask: 0.2066, loss: 0.5995 2024-05-29 04:22:28,067 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 4:21:15, time: 0.926, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0401, loss_cls: 0.1555, acc: 94.1064, loss_bbox: 0.2068, loss_mask: 0.2166, loss: 0.6326 2024-05-29 04:23:14,642 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 4:20:28, time: 0.931, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0391, loss_cls: 0.1495, acc: 94.3188, loss_bbox: 0.2012, loss_mask: 0.2151, loss: 0.6165 2024-05-29 04:24:03,334 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 4:19:42, time: 0.974, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0377, loss_cls: 0.1500, acc: 94.2280, loss_bbox: 0.2024, loss_mask: 0.2120, loss: 0.6145 2024-05-29 04:24:49,370 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 4:18:55, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0394, loss_cls: 0.1548, acc: 94.0071, loss_bbox: 0.2095, loss_mask: 0.2189, loss: 0.6353 2024-05-29 04:25:35,374 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 4:18:08, time: 0.920, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0378, loss_cls: 0.1492, acc: 94.2261, loss_bbox: 0.2021, loss_mask: 0.2150, loss: 0.6162 2024-05-29 04:26:21,275 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 4:17:21, time: 0.918, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0351, loss_cls: 0.1429, acc: 94.4780, loss_bbox: 0.1963, loss_mask: 0.2093, loss: 0.5947 2024-05-29 04:27:07,541 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 4:16:35, time: 0.925, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0379, loss_cls: 0.1527, acc: 94.1272, loss_bbox: 0.2056, loss_mask: 0.2115, loss: 0.6187 2024-05-29 04:27:55,866 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 4:15:48, time: 0.966, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0344, loss_cls: 0.1443, acc: 94.5308, loss_bbox: 0.1916, loss_mask: 0.2071, loss: 0.5888 2024-05-29 04:28:40,986 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 4:15:01, time: 0.902, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0359, loss_cls: 0.1420, acc: 94.6465, loss_bbox: 0.1919, loss_mask: 0.2039, loss: 0.5858 2024-05-29 04:29:31,913 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 4:14:16, time: 1.019, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0363, loss_cls: 0.1522, acc: 94.2725, loss_bbox: 0.2038, loss_mask: 0.2135, loss: 0.6177 2024-05-29 04:30:18,828 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 4:13:29, time: 0.938, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0405, loss_cls: 0.1552, acc: 94.0244, loss_bbox: 0.2054, loss_mask: 0.2162, loss: 0.6297 2024-05-29 04:31:04,930 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 4:12:42, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0382, 
loss_cls: 0.1500, acc: 94.2815, loss_bbox: 0.2018, loss_mask: 0.2123, loss: 0.6143 2024-05-29 04:31:53,828 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 4:11:56, time: 0.978, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0387, loss_cls: 0.1537, acc: 94.1296, loss_bbox: 0.2026, loss_mask: 0.2146, loss: 0.6219 2024-05-29 04:32:39,767 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 4:11:09, time: 0.919, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0369, loss_cls: 0.1468, acc: 94.4373, loss_bbox: 0.1974, loss_mask: 0.2110, loss: 0.6053 2024-05-29 04:33:26,037 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 4:10:22, time: 0.925, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0362, loss_cls: 0.1491, acc: 94.3323, loss_bbox: 0.1966, loss_mask: 0.2106, loss: 0.6051 2024-05-29 04:34:15,667 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 4:09:36, time: 0.993, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0392, loss_cls: 0.1595, acc: 93.8313, loss_bbox: 0.2093, loss_mask: 0.2136, loss: 0.6345 2024-05-29 04:35:01,826 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 4:08:50, time: 0.923, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0377, loss_cls: 0.1484, acc: 94.3286, loss_bbox: 0.1921, loss_mask: 0.2068, loss: 0.5976 2024-05-29 04:35:49,592 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 4:08:03, time: 0.955, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0343, loss_cls: 0.1393, acc: 94.7649, loss_bbox: 0.1891, loss_mask: 0.2064, loss: 0.5813 2024-05-29 04:36:36,075 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 4:07:16, time: 0.930, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0372, loss_cls: 0.1434, acc: 94.6675, loss_bbox: 0.1897, loss_mask: 0.2058, loss: 0.5881 2024-05-29 04:37:24,781 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 4:06:30, time: 0.974, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0367, loss_cls: 0.1457, acc: 94.4177, loss_bbox: 0.1958, loss_mask: 0.2050, loss: 0.5945 2024-05-29 04:38:10,483 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 4:05:43, time: 0.914, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0360, loss_cls: 0.1413, acc: 94.5542, loss_bbox: 0.1923, loss_mask: 0.2098, loss: 0.5901 2024-05-29 04:38:56,645 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 4:04:56, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0374, loss_cls: 0.1466, acc: 94.3694, loss_bbox: 0.1983, loss_mask: 0.2135, loss: 0.6081 2024-05-29 04:39:42,822 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 4:04:10, time: 0.924, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0371, loss_cls: 0.1469, acc: 94.3425, loss_bbox: 0.1986, loss_mask: 0.2106, loss: 0.6057 2024-05-29 04:40:31,109 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 4:03:23, time: 0.966, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0368, loss_cls: 0.1435, acc: 94.5010, loss_bbox: 0.1920, loss_mask: 0.2053, loss: 0.5890 2024-05-29 04:41:16,931 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 4:02:36, time: 0.916, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0336, loss_cls: 0.1434, acc: 94.5581, loss_bbox: 0.1944, loss_mask: 0.2094, loss: 0.5919 2024-05-29 
04:42:02,767 - mmdet - INFO - Epoch [10][6450/7330] lr: 1.000e-05, eta: 4:01:50, time: 0.917, data_time: 0.034, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0346, loss_cls: 0.1360, acc: 94.7195, loss_bbox: 0.1894, loss_mask: 0.2100, loss: 0.5805 2024-05-29 04:42:49,360 - mmdet - INFO - Epoch [10][6500/7330] lr: 1.000e-05, eta: 4:01:03, time: 0.932, data_time: 0.037, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0368, loss_cls: 0.1445, acc: 94.4563, loss_bbox: 0.1941, loss_mask: 0.2094, loss: 0.5978 2024-05-29 04:43:37,882 - mmdet - INFO - Epoch [10][6550/7330] lr: 1.000e-05, eta: 4:00:17, time: 0.970, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0392, loss_cls: 0.1530, acc: 94.1562, loss_bbox: 0.2016, loss_mask: 0.2098, loss: 0.6161 2024-05-29 04:44:24,279 - mmdet - INFO - Epoch [10][6600/7330] lr: 1.000e-05, eta: 3:59:30, time: 0.928, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0378, loss_cls: 0.1492, acc: 94.2686, loss_bbox: 0.2011, loss_mask: 0.2133, loss: 0.6128 2024-05-29 04:45:10,811 - mmdet - INFO - Epoch [10][6650/7330] lr: 1.000e-05, eta: 3:58:43, time: 0.931, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0399, loss_cls: 0.1612, acc: 93.8579, loss_bbox: 0.2132, loss_mask: 0.2200, loss: 0.6482 2024-05-29 04:46:01,882 - mmdet - INFO - Epoch [10][6700/7330] lr: 1.000e-05, eta: 3:57:57, time: 1.021, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0381, loss_cls: 0.1511, acc: 94.2598, loss_bbox: 0.2044, loss_mask: 0.2140, loss: 0.6200 2024-05-29 04:46:47,743 - mmdet - INFO - Epoch [10][6750/7330] lr: 1.000e-05, eta: 3:57:10, time: 0.917, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0355, loss_cls: 0.1396, acc: 94.7000, loss_bbox: 0.1883, loss_mask: 0.2092, loss: 0.5834 2024-05-29 04:47:34,176 - mmdet - INFO - Epoch [10][6800/7330] lr: 1.000e-05, eta: 3:56:24, time: 0.929, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0386, loss_cls: 0.1523, acc: 94.1919, loss_bbox: 0.2003, loss_mask: 0.2144, loss: 0.6186 2024-05-29 04:48:23,225 - mmdet - INFO - Epoch [10][6850/7330] lr: 1.000e-05, eta: 3:55:37, time: 0.981, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0415, loss_cls: 0.1555, acc: 94.0947, loss_bbox: 0.2079, loss_mask: 0.2167, loss: 0.6343 2024-05-29 04:49:09,565 - mmdet - INFO - Epoch [10][6900/7330] lr: 1.000e-05, eta: 3:54:51, time: 0.927, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0384, loss_cls: 0.1528, acc: 94.0706, loss_bbox: 0.2079, loss_mask: 0.2172, loss: 0.6285 2024-05-29 04:49:55,590 - mmdet - INFO - Epoch [10][6950/7330] lr: 1.000e-05, eta: 3:54:04, time: 0.920, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0350, loss_cls: 0.1429, acc: 94.4553, loss_bbox: 0.1968, loss_mask: 0.2054, loss: 0.5909 2024-05-29 04:50:44,450 - mmdet - INFO - Epoch [10][7000/7330] lr: 1.000e-05, eta: 3:53:18, time: 0.977, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0374, loss_cls: 0.1419, acc: 94.6208, loss_bbox: 0.1921, loss_mask: 0.2062, loss: 0.5898 2024-05-29 04:51:30,719 - mmdet - INFO - Epoch [10][7050/7330] lr: 1.000e-05, eta: 3:52:31, time: 0.925, data_time: 0.034, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0375, loss_cls: 0.1518, acc: 94.2334, loss_bbox: 0.1991, loss_mask: 0.2104, loss: 0.6109 2024-05-29 04:52:18,508 - mmdet - INFO - Epoch [10][7100/7330] lr: 1.000e-05, eta: 3:51:44, time: 0.956, 
data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0341, loss_cls: 0.1475, acc: 94.3535, loss_bbox: 0.1983, loss_mask: 0.2091, loss: 0.6015
2024-05-29 04:53:05,144 - mmdet - INFO - Epoch [10][7150/7330] lr: 1.000e-05, eta: 3:50:58, time: 0.933, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0377, loss_cls: 0.1483, acc: 94.2969, loss_bbox: 0.1988, loss_mask: 0.2100, loss: 0.6061
2024-05-29 04:53:53,803 - mmdet - INFO - Epoch [10][7200/7330] lr: 1.000e-05, eta: 3:50:11, time: 0.973, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0378, loss_cls: 0.1439, acc: 94.3835, loss_bbox: 0.1938, loss_mask: 0.2094, loss: 0.5965
2024-05-29 04:54:39,615 - mmdet - INFO - Epoch [10][7250/7330] lr: 1.000e-05, eta: 3:49:25, time: 0.916, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0359, loss_cls: 0.1459, acc: 94.3459, loss_bbox: 0.1974, loss_mask: 0.2065, loss: 0.5974
2024-05-29 04:55:25,876 - mmdet - INFO - Epoch [10][7300/7330] lr: 1.000e-05, eta: 3:48:38, time: 0.925, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0354, loss_cls: 0.1489, acc: 94.2749, loss_bbox: 0.1993, loss_mask: 0.2087, loss: 0.6046
2024-05-29 04:55:54,034 - mmdet - INFO - Saving checkpoint at 10 epochs
2024-05-29 04:58:03,203 - mmdet - INFO - Evaluating bbox...
2024-05-29 04:58:22,682 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.497
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.726
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.547
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.317
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.538
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.666
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.610
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.610
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.610
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.424
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.650
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.770
2024-05-29 04:58:22,683 - mmdet - INFO - Evaluating segm...
2024-05-29 04:58:44,581 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.444
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.693
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.477
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.229
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.480
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.658
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.549
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.549
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.549
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.352
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.594
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.732
2024-05-29 04:58:44,871 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 04:58:44,873 - mmdet - INFO - Epoch(val) [10][625] bbox_mAP: 0.4970, bbox_mAP_50: 0.7260, bbox_mAP_75: 0.5470, bbox_mAP_s: 0.3170, bbox_mAP_m: 0.5380, bbox_mAP_l: 0.6660, bbox_mAP_copypaste: 0.497 0.726 0.547 0.317 0.538 0.666, segm_mAP: 0.4440, segm_mAP_50: 0.6930, segm_mAP_75: 0.4770, segm_mAP_s: 0.2290, segm_mAP_m: 0.4800, segm_mAP_l: 0.6580, segm_mAP_copypaste: 0.444 0.693 0.477 0.229 0.480 0.658
2024-05-29 04:59:40,556 - mmdet - INFO - Epoch [11][50/7330] lr: 1.000e-05, eta: 3:47:19, time: 1.113, data_time: 0.112, memory: 20925, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0348, loss_cls: 0.1366, acc: 94.8074, loss_bbox: 0.1846, loss_mask: 0.2024, loss: 0.5680
2024-05-29 05:00:29,087 - mmdet - INFO - Epoch [11][100/7330] lr: 1.000e-05, eta: 3:46:33, time: 0.971, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0375, loss_cls: 0.1479, acc: 94.2742, loss_bbox: 0.1990, loss_mask: 0.2113, loss: 0.6069
2024-05-29 05:01:17,891 - mmdet - INFO - Epoch [11][150/7330] lr: 1.000e-05, eta: 3:45:47, time: 0.976, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0388, loss_cls: 0.1528, acc: 94.1196, loss_bbox: 0.2074, loss_mask: 0.2095, loss: 0.6217
2024-05-29 05:02:03,944 - mmdet - INFO - Epoch [11][200/7330] lr: 1.000e-05, eta: 3:45:00, time: 0.921, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0371, loss_cls: 0.1469, acc: 94.3245, loss_bbox: 0.1998, loss_mask: 0.2076, loss: 0.6038
2024-05-29 05:02:49,676 - mmdet - INFO - Epoch [11][250/7330] lr: 1.000e-05, eta: 3:44:13, time: 0.915, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0392, loss_cls: 0.1477, acc: 94.2600, loss_bbox: 0.1998, loss_mask: 0.2049, loss: 0.6030
2024-05-29 05:03:36,144 - mmdet - INFO - Epoch [11][300/7330] lr: 1.000e-05, eta: 3:43:26, time: 0.929, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0392, loss_cls: 0.1483, acc: 94.3418, loss_bbox: 0.2011, loss_mask: 0.2134, loss: 0.6145
2024-05-29 05:04:22,105 - mmdet - INFO - Epoch [11][350/7330] lr: 1.000e-05, eta: 3:42:39, time: 0.919, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0354, loss_cls: 0.1421, acc: 94.6030, loss_bbox: 0.1907, loss_mask: 0.2041, loss: 0.5833
2024-05-29 05:05:08,237 - mmdet - INFO - Epoch [11][400/7330] lr: 1.000e-05, eta: 3:41:53, time: 0.923, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0365, loss_cls: 0.1440, acc: 94.3901, loss_bbox: 0.1956,
loss_mask: 0.2090, loss: 0.5963 2024-05-29 05:05:53,795 - mmdet - INFO - Epoch [11][450/7330] lr: 1.000e-05, eta: 3:41:06, time: 0.911, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0357, loss_cls: 0.1439, acc: 94.4810, loss_bbox: 0.1935, loss_mask: 0.2042, loss: 0.5895 2024-05-29 05:06:39,720 - mmdet - INFO - Epoch [11][500/7330] lr: 1.000e-05, eta: 3:40:19, time: 0.918, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0358, loss_cls: 0.1459, acc: 94.3257, loss_bbox: 0.1946, loss_mask: 0.2087, loss: 0.5966 2024-05-29 05:07:25,147 - mmdet - INFO - Epoch [11][550/7330] lr: 1.000e-05, eta: 3:39:32, time: 0.909, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0383, loss_cls: 0.1475, acc: 94.3328, loss_bbox: 0.1974, loss_mask: 0.2061, loss: 0.6017 2024-05-29 05:08:11,321 - mmdet - INFO - Epoch [11][600/7330] lr: 1.000e-05, eta: 3:38:45, time: 0.923, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0364, loss_cls: 0.1485, acc: 94.3062, loss_bbox: 0.2004, loss_mask: 0.2151, loss: 0.6114 2024-05-29 05:09:00,176 - mmdet - INFO - Epoch [11][650/7330] lr: 1.000e-05, eta: 3:37:59, time: 0.977, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0382, loss_cls: 0.1446, acc: 94.4331, loss_bbox: 0.1964, loss_mask: 0.2079, loss: 0.5984 2024-05-29 05:09:46,188 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 05:09:46,188 - mmdet - INFO - Epoch [11][700/7330] lr: 1.000e-05, eta: 3:37:12, time: 0.920, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0358, loss_cls: 0.1439, acc: 94.4768, loss_bbox: 0.1981, loss_mask: 0.2095, loss: 0.5987 2024-05-29 05:10:33,381 - mmdet - INFO - Epoch [11][750/7330] lr: 1.000e-05, eta: 3:36:26, time: 0.944, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0372, loss_cls: 0.1460, acc: 94.4385, loss_bbox: 0.1954, loss_mask: 0.2075, loss: 0.5979 2024-05-29 05:11:18,814 - mmdet - INFO - Epoch [11][800/7330] lr: 1.000e-05, eta: 3:35:39, time: 0.909, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0394, loss_cls: 0.1468, acc: 94.4009, loss_bbox: 0.1979, loss_mask: 0.2094, loss: 0.6053 2024-05-29 05:12:04,445 - mmdet - INFO - Epoch [11][850/7330] lr: 1.000e-05, eta: 3:34:52, time: 0.913, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0327, loss_cls: 0.1420, acc: 94.5127, loss_bbox: 0.1909, loss_mask: 0.2060, loss: 0.5824 2024-05-29 05:12:50,497 - mmdet - INFO - Epoch [11][900/7330] lr: 1.000e-05, eta: 3:34:05, time: 0.921, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0374, loss_cls: 0.1398, acc: 94.5417, loss_bbox: 0.1906, loss_mask: 0.2050, loss: 0.5828 2024-05-29 05:13:38,578 - mmdet - INFO - Epoch [11][950/7330] lr: 1.000e-05, eta: 3:33:19, time: 0.961, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0356, loss_cls: 0.1366, acc: 94.7747, loss_bbox: 0.1884, loss_mask: 0.2088, loss: 0.5804 2024-05-29 05:14:24,040 - mmdet - INFO - Epoch [11][1000/7330] lr: 1.000e-05, eta: 3:32:32, time: 0.910, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0342, loss_cls: 0.1329, acc: 94.8032, loss_bbox: 0.1839, loss_mask: 0.2063, loss: 0.5677 2024-05-29 05:15:12,427 - mmdet - INFO - Epoch [11][1050/7330] lr: 1.000e-05, eta: 3:31:45, time: 0.968, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0350, loss_cls: 0.1439, acc: 94.4441, loss_bbox: 0.1961, 
loss_mask: 0.2112, loss: 0.5969 2024-05-29 05:15:58,560 - mmdet - INFO - Epoch [11][1100/7330] lr: 1.000e-05, eta: 3:30:59, time: 0.923, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0389, loss_cls: 0.1521, acc: 94.2178, loss_bbox: 0.2053, loss_mask: 0.2134, loss: 0.6219 2024-05-29 05:16:48,904 - mmdet - INFO - Epoch [11][1150/7330] lr: 1.000e-05, eta: 3:30:13, time: 1.007, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0369, loss_cls: 0.1433, acc: 94.4089, loss_bbox: 0.2027, loss_mask: 0.2107, loss: 0.6049 2024-05-29 05:17:35,563 - mmdet - INFO - Epoch [11][1200/7330] lr: 1.000e-05, eta: 3:29:26, time: 0.933, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0352, loss_cls: 0.1429, acc: 94.4915, loss_bbox: 0.1935, loss_mask: 0.2085, loss: 0.5913 2024-05-29 05:18:22,636 - mmdet - INFO - Epoch [11][1250/7330] lr: 1.000e-05, eta: 3:28:39, time: 0.941, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0403, loss_cls: 0.1507, acc: 94.1772, loss_bbox: 0.2033, loss_mask: 0.2122, loss: 0.6188 2024-05-29 05:19:14,164 - mmdet - INFO - Epoch [11][1300/7330] lr: 1.000e-05, eta: 3:27:53, time: 1.031, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0397, loss_cls: 0.1525, acc: 94.0930, loss_bbox: 0.2035, loss_mask: 0.2109, loss: 0.6195 2024-05-29 05:20:00,097 - mmdet - INFO - Epoch [11][1350/7330] lr: 1.000e-05, eta: 3:27:07, time: 0.919, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0363, loss_cls: 0.1426, acc: 94.4512, loss_bbox: 0.1944, loss_mask: 0.2067, loss: 0.5911 2024-05-29 05:20:50,112 - mmdet - INFO - Epoch [11][1400/7330] lr: 1.000e-05, eta: 3:26:21, time: 1.000, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0392, loss_cls: 0.1469, acc: 94.3755, loss_bbox: 0.2006, loss_mask: 0.2116, loss: 0.6104 2024-05-29 05:21:35,637 - mmdet - INFO - Epoch [11][1450/7330] lr: 1.000e-05, eta: 3:25:34, time: 0.911, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0354, loss_cls: 0.1419, acc: 94.5352, loss_bbox: 0.1944, loss_mask: 0.2111, loss: 0.5945 2024-05-29 05:22:22,317 - mmdet - INFO - Epoch [11][1500/7330] lr: 1.000e-05, eta: 3:24:47, time: 0.934, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0389, loss_cls: 0.1554, acc: 94.0330, loss_bbox: 0.2083, loss_mask: 0.2140, loss: 0.6295 2024-05-29 05:23:08,269 - mmdet - INFO - Epoch [11][1550/7330] lr: 1.000e-05, eta: 3:24:00, time: 0.919, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0375, loss_cls: 0.1410, acc: 94.5996, loss_bbox: 0.1930, loss_mask: 0.2080, loss: 0.5911 2024-05-29 05:23:53,956 - mmdet - INFO - Epoch [11][1600/7330] lr: 1.000e-05, eta: 3:23:13, time: 0.914, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0370, loss_cls: 0.1483, acc: 94.3235, loss_bbox: 0.2064, loss_mask: 0.2116, loss: 0.6148 2024-05-29 05:24:40,242 - mmdet - INFO - Epoch [11][1650/7330] lr: 1.000e-05, eta: 3:22:27, time: 0.926, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0385, loss_cls: 0.1512, acc: 94.2900, loss_bbox: 0.2021, loss_mask: 0.2124, loss: 0.6159 2024-05-29 05:25:29,509 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 05:25:29,509 - mmdet - INFO - Epoch [11][1700/7330] lr: 1.000e-05, eta: 3:21:40, time: 0.985, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0380, loss_cls: 0.1516, acc: 94.1924, loss_bbox: 
0.2035, loss_mask: 0.2105, loss: 0.6144 2024-05-29 05:26:15,583 - mmdet - INFO - Epoch [11][1750/7330] lr: 1.000e-05, eta: 3:20:54, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0381, loss_cls: 0.1512, acc: 94.2063, loss_bbox: 0.2049, loss_mask: 0.2133, loss: 0.6207 2024-05-29 05:27:01,864 - mmdet - INFO - Epoch [11][1800/7330] lr: 1.000e-05, eta: 3:20:07, time: 0.926, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0374, loss_cls: 0.1442, acc: 94.4202, loss_bbox: 0.1962, loss_mask: 0.2095, loss: 0.5999 2024-05-29 05:27:47,686 - mmdet - INFO - Epoch [11][1850/7330] lr: 1.000e-05, eta: 3:19:20, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0366, loss_cls: 0.1460, acc: 94.3916, loss_bbox: 0.1970, loss_mask: 0.2102, loss: 0.6015 2024-05-29 05:28:33,330 - mmdet - INFO - Epoch [11][1900/7330] lr: 1.000e-05, eta: 3:18:33, time: 0.913, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0376, loss_cls: 0.1488, acc: 94.2922, loss_bbox: 0.2053, loss_mask: 0.2148, loss: 0.6185 2024-05-29 05:29:19,712 - mmdet - INFO - Epoch [11][1950/7330] lr: 1.000e-05, eta: 3:17:46, time: 0.928, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0382, loss_cls: 0.1518, acc: 94.0935, loss_bbox: 0.2073, loss_mask: 0.2168, loss: 0.6259 2024-05-29 05:30:08,767 - mmdet - INFO - Epoch [11][2000/7330] lr: 1.000e-05, eta: 3:17:00, time: 0.981, data_time: 0.070, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0382, loss_cls: 0.1499, acc: 94.2351, loss_bbox: 0.1939, loss_mask: 0.2050, loss: 0.5980 2024-05-29 05:30:54,298 - mmdet - INFO - Epoch [11][2050/7330] lr: 1.000e-05, eta: 3:16:13, time: 0.911, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0370, loss_cls: 0.1483, acc: 94.3279, loss_bbox: 0.2014, loss_mask: 0.2123, loss: 0.6105 2024-05-29 05:31:42,612 - mmdet - INFO - Epoch [11][2100/7330] lr: 1.000e-05, eta: 3:15:27, time: 0.966, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0371, loss_cls: 0.1443, acc: 94.5015, loss_bbox: 0.1985, loss_mask: 0.2124, loss: 0.6030 2024-05-29 05:32:28,816 - mmdet - INFO - Epoch [11][2150/7330] lr: 1.000e-05, eta: 3:14:40, time: 0.924, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0380, loss_cls: 0.1504, acc: 94.2571, loss_bbox: 0.2028, loss_mask: 0.2123, loss: 0.6151 2024-05-29 05:33:21,227 - mmdet - INFO - Epoch [11][2200/7330] lr: 1.000e-05, eta: 3:13:54, time: 1.048, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0370, loss_cls: 0.1461, acc: 94.3904, loss_bbox: 0.1960, loss_mask: 0.2058, loss: 0.5972 2024-05-29 05:34:07,042 - mmdet - INFO - Epoch [11][2250/7330] lr: 1.000e-05, eta: 3:13:07, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0375, loss_cls: 0.1534, acc: 94.1509, loss_bbox: 0.2041, loss_mask: 0.2135, loss: 0.6195 2024-05-29 05:34:52,606 - mmdet - INFO - Epoch [11][2300/7330] lr: 1.000e-05, eta: 3:12:21, time: 0.911, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0372, loss_cls: 0.1444, acc: 94.4326, loss_bbox: 0.1935, loss_mask: 0.2086, loss: 0.5961 2024-05-29 05:35:43,156 - mmdet - INFO - Epoch [11][2350/7330] lr: 1.000e-05, eta: 3:11:34, time: 1.011, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0378, loss_cls: 0.1451, acc: 94.4067, loss_bbox: 0.2000, loss_mask: 0.2123, loss: 0.6082 2024-05-29 05:36:29,310 - mmdet - INFO - Epoch 
[11][2400/7330] lr: 1.000e-05, eta: 3:10:48, time: 0.923, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0385, loss_cls: 0.1540, acc: 94.0845, loss_bbox: 0.2052, loss_mask: 0.2137, loss: 0.6231 2024-05-29 05:37:17,849 - mmdet - INFO - Epoch [11][2450/7330] lr: 1.000e-05, eta: 3:10:01, time: 0.971, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0358, loss_cls: 0.1405, acc: 94.6667, loss_bbox: 0.1913, loss_mask: 0.2104, loss: 0.5886 2024-05-29 05:38:04,383 - mmdet - INFO - Epoch [11][2500/7330] lr: 1.000e-05, eta: 3:09:15, time: 0.931, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0370, loss_cls: 0.1480, acc: 94.2305, loss_bbox: 0.2009, loss_mask: 0.2102, loss: 0.6072 2024-05-29 05:38:51,159 - mmdet - INFO - Epoch [11][2550/7330] lr: 1.000e-05, eta: 3:08:28, time: 0.935, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0370, loss_cls: 0.1538, acc: 94.0208, loss_bbox: 0.2041, loss_mask: 0.2112, loss: 0.6190 2024-05-29 05:39:37,166 - mmdet - INFO - Epoch [11][2600/7330] lr: 1.000e-05, eta: 3:07:41, time: 0.920, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0351, loss_cls: 0.1434, acc: 94.4961, loss_bbox: 0.1930, loss_mask: 0.2081, loss: 0.5912 2024-05-29 05:40:24,101 - mmdet - INFO - Epoch [11][2650/7330] lr: 1.000e-05, eta: 3:06:54, time: 0.939, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0362, loss_cls: 0.1438, acc: 94.4705, loss_bbox: 0.1967, loss_mask: 0.2114, loss: 0.5996 2024-05-29 05:41:09,856 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 05:41:09,857 - mmdet - INFO - Epoch [11][2700/7330] lr: 1.000e-05, eta: 3:06:08, time: 0.915, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0356, loss_cls: 0.1391, acc: 94.6912, loss_bbox: 0.1890, loss_mask: 0.2049, loss: 0.5802 2024-05-29 05:41:58,464 - mmdet - INFO - Epoch [11][2750/7330] lr: 1.000e-05, eta: 3:05:21, time: 0.972, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0368, loss_cls: 0.1442, acc: 94.4622, loss_bbox: 0.1963, loss_mask: 0.2055, loss: 0.5934 2024-05-29 05:42:44,159 - mmdet - INFO - Epoch [11][2800/7330] lr: 1.000e-05, eta: 3:04:34, time: 0.914, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0370, loss_cls: 0.1413, acc: 94.6118, loss_bbox: 0.1926, loss_mask: 0.2037, loss: 0.5854 2024-05-29 05:43:30,041 - mmdet - INFO - Epoch [11][2850/7330] lr: 1.000e-05, eta: 3:03:48, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0358, loss_cls: 0.1388, acc: 94.5947, loss_bbox: 0.1914, loss_mask: 0.2078, loss: 0.5846 2024-05-29 05:44:15,662 - mmdet - INFO - Epoch [11][2900/7330] lr: 1.000e-05, eta: 3:03:01, time: 0.912, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0363, loss_cls: 0.1472, acc: 94.3674, loss_bbox: 0.1980, loss_mask: 0.2058, loss: 0.5982 2024-05-29 05:45:01,588 - mmdet - INFO - Epoch [11][2950/7330] lr: 1.000e-05, eta: 3:02:14, time: 0.919, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0362, loss_cls: 0.1456, acc: 94.2961, loss_bbox: 0.1983, loss_mask: 0.2077, loss: 0.5987 2024-05-29 05:45:47,939 - mmdet - INFO - Epoch [11][3000/7330] lr: 1.000e-05, eta: 3:01:27, time: 0.927, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0367, loss_cls: 0.1482, acc: 94.3696, loss_bbox: 0.1976, loss_mask: 0.2085, loss: 0.6039 2024-05-29 05:46:38,447 - mmdet - INFO 
- Epoch [11][3050/7330] lr: 1.000e-05, eta: 3:00:41, time: 1.010, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0347, loss_cls: 0.1415, acc: 94.5840, loss_bbox: 0.1935, loss_mask: 0.2050, loss: 0.5857 2024-05-29 05:47:24,117 - mmdet - INFO - Epoch [11][3100/7330] lr: 1.000e-05, eta: 2:59:54, time: 0.913, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0348, loss_cls: 0.1417, acc: 94.4934, loss_bbox: 0.1944, loss_mask: 0.2071, loss: 0.5887 2024-05-29 05:48:12,502 - mmdet - INFO - Epoch [11][3150/7330] lr: 1.000e-05, eta: 2:59:08, time: 0.968, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0369, loss_cls: 0.1494, acc: 94.2852, loss_bbox: 0.2012, loss_mask: 0.2094, loss: 0.6094 2024-05-29 05:48:58,230 - mmdet - INFO - Epoch [11][3200/7330] lr: 1.000e-05, eta: 2:58:21, time: 0.915, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0367, loss_cls: 0.1466, acc: 94.3689, loss_bbox: 0.1946, loss_mask: 0.2063, loss: 0.5961 2024-05-29 05:49:49,804 - mmdet - INFO - Epoch [11][3250/7330] lr: 1.000e-05, eta: 2:57:35, time: 1.031, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0399, loss_cls: 0.1504, acc: 94.1824, loss_bbox: 0.2055, loss_mask: 0.2144, loss: 0.6220 2024-05-29 05:50:35,821 - mmdet - INFO - Epoch [11][3300/7330] lr: 1.000e-05, eta: 2:56:48, time: 0.920, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0345, loss_cls: 0.1374, acc: 94.7048, loss_bbox: 0.1870, loss_mask: 0.2066, loss: 0.5755 2024-05-29 05:51:21,875 - mmdet - INFO - Epoch [11][3350/7330] lr: 1.000e-05, eta: 2:56:01, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0363, loss_cls: 0.1472, acc: 94.3638, loss_bbox: 0.2007, loss_mask: 0.2121, loss: 0.6078 2024-05-29 05:52:12,874 - mmdet - INFO - Epoch [11][3400/7330] lr: 1.000e-05, eta: 2:55:15, time: 1.020, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0371, loss_cls: 0.1466, acc: 94.4114, loss_bbox: 0.1984, loss_mask: 0.2128, loss: 0.6067 2024-05-29 05:52:58,929 - mmdet - INFO - Epoch [11][3450/7330] lr: 1.000e-05, eta: 2:54:29, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0360, loss_cls: 0.1436, acc: 94.4419, loss_bbox: 0.1931, loss_mask: 0.2103, loss: 0.5946 2024-05-29 05:53:47,896 - mmdet - INFO - Epoch [11][3500/7330] lr: 1.000e-05, eta: 2:53:42, time: 0.979, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0379, loss_cls: 0.1537, acc: 94.0842, loss_bbox: 0.2060, loss_mask: 0.2120, loss: 0.6210 2024-05-29 05:54:33,648 - mmdet - INFO - Epoch [11][3550/7330] lr: 1.000e-05, eta: 2:52:55, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0370, loss_cls: 0.1443, acc: 94.4641, loss_bbox: 0.1970, loss_mask: 0.2113, loss: 0.6021 2024-05-29 05:55:19,726 - mmdet - INFO - Epoch [11][3600/7330] lr: 1.000e-05, eta: 2:52:09, time: 0.922, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0387, loss_cls: 0.1530, acc: 94.1177, loss_bbox: 0.2034, loss_mask: 0.2112, loss: 0.6188 2024-05-29 05:56:05,785 - mmdet - INFO - Epoch [11][3650/7330] lr: 1.000e-05, eta: 2:51:22, time: 0.921, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0404, loss_cls: 0.1571, acc: 94.0010, loss_bbox: 0.2047, loss_mask: 0.2153, loss: 0.6298 2024-05-29 05:56:51,443 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 05:56:51,444 - mmdet 
- INFO - Epoch [11][3700/7330] lr: 1.000e-05, eta: 2:50:35, time: 0.913, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0351, loss_cls: 0.1375, acc: 94.7061, loss_bbox: 0.1884, loss_mask: 0.2055, loss: 0.5773 2024-05-29 05:57:37,148 - mmdet - INFO - Epoch [11][3750/7330] lr: 1.000e-05, eta: 2:49:48, time: 0.914, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0374, loss_cls: 0.1501, acc: 94.2415, loss_bbox: 0.2019, loss_mask: 0.2140, loss: 0.6158 2024-05-29 05:58:26,787 - mmdet - INFO - Epoch [11][3800/7330] lr: 1.000e-05, eta: 2:49:02, time: 0.993, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0387, loss_cls: 0.1455, acc: 94.3647, loss_bbox: 0.2026, loss_mask: 0.2097, loss: 0.6078 2024-05-29 05:59:13,207 - mmdet - INFO - Epoch [11][3850/7330] lr: 1.000e-05, eta: 2:48:15, time: 0.928, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0373, loss_cls: 0.1485, acc: 94.2913, loss_bbox: 0.2005, loss_mask: 0.2128, loss: 0.6116 2024-05-29 05:59:59,585 - mmdet - INFO - Epoch [11][3900/7330] lr: 1.000e-05, eta: 2:47:28, time: 0.928, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0386, loss_cls: 0.1537, acc: 94.0825, loss_bbox: 0.2057, loss_mask: 0.2149, loss: 0.6261 2024-05-29 06:00:45,243 - mmdet - INFO - Epoch [11][3950/7330] lr: 1.000e-05, eta: 2:46:42, time: 0.913, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1467, acc: 94.3655, loss_bbox: 0.1992, loss_mask: 0.2143, loss: 0.6092 2024-05-29 06:01:31,187 - mmdet - INFO - Epoch [11][4000/7330] lr: 1.000e-05, eta: 2:45:55, time: 0.919, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0394, loss_cls: 0.1469, acc: 94.3374, loss_bbox: 0.1931, loss_mask: 0.2097, loss: 0.6010 2024-05-29 06:02:17,614 - mmdet - INFO - Epoch [11][4050/7330] lr: 1.000e-05, eta: 2:45:08, time: 0.928, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0378, loss_cls: 0.1462, acc: 94.4001, loss_bbox: 0.1950, loss_mask: 0.2100, loss: 0.6021 2024-05-29 06:03:06,416 - mmdet - INFO - Epoch [11][4100/7330] lr: 1.000e-05, eta: 2:44:22, time: 0.976, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0364, loss_cls: 0.1403, acc: 94.5520, loss_bbox: 0.1929, loss_mask: 0.2094, loss: 0.5897 2024-05-29 06:03:52,605 - mmdet - INFO - Epoch [11][4150/7330] lr: 1.000e-05, eta: 2:43:35, time: 0.924, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0397, loss_cls: 0.1520, acc: 94.1921, loss_bbox: 0.2050, loss_mask: 0.2110, loss: 0.6193 2024-05-29 06:04:42,604 - mmdet - INFO - Epoch [11][4200/7330] lr: 1.000e-05, eta: 2:42:49, time: 1.000, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0377, loss_cls: 0.1501, acc: 94.2083, loss_bbox: 0.2012, loss_mask: 0.2095, loss: 0.6111 2024-05-29 06:05:28,729 - mmdet - INFO - Epoch [11][4250/7330] lr: 1.000e-05, eta: 2:42:02, time: 0.922, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0361, loss_cls: 0.1460, acc: 94.3606, loss_bbox: 0.1989, loss_mask: 0.2081, loss: 0.6003 2024-05-29 06:06:21,470 - mmdet - INFO - Epoch [11][4300/7330] lr: 1.000e-05, eta: 2:41:16, time: 1.055, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0371, loss_cls: 0.1459, acc: 94.3728, loss_bbox: 0.2027, loss_mask: 0.2077, loss: 0.6036 2024-05-29 06:07:07,624 - mmdet - INFO - Epoch [11][4350/7330] lr: 1.000e-05, eta: 2:40:29, time: 0.923, data_time: 0.043, 
memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0366, loss_cls: 0.1471, acc: 94.3333, loss_bbox: 0.2006, loss_mask: 0.2106, loss: 0.6066 2024-05-29 06:07:54,109 - mmdet - INFO - Epoch [11][4400/7330] lr: 1.000e-05, eta: 2:39:42, time: 0.930, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0393, loss_cls: 0.1514, acc: 94.2014, loss_bbox: 0.2053, loss_mask: 0.2191, loss: 0.6276 2024-05-29 06:08:44,385 - mmdet - INFO - Epoch [11][4450/7330] lr: 1.000e-05, eta: 2:38:56, time: 1.006, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0372, loss_cls: 0.1429, acc: 94.4919, loss_bbox: 0.1981, loss_mask: 0.2077, loss: 0.5970 2024-05-29 06:09:30,250 - mmdet - INFO - Epoch [11][4500/7330] lr: 1.000e-05, eta: 2:38:09, time: 0.917, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0341, loss_cls: 0.1457, acc: 94.3577, loss_bbox: 0.1939, loss_mask: 0.2032, loss: 0.5884 2024-05-29 06:10:19,040 - mmdet - INFO - Epoch [11][4550/7330] lr: 1.000e-05, eta: 2:37:23, time: 0.976, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0372, loss_cls: 0.1475, acc: 94.3318, loss_bbox: 0.1993, loss_mask: 0.2115, loss: 0.6077 2024-05-29 06:11:05,011 - mmdet - INFO - Epoch [11][4600/7330] lr: 1.000e-05, eta: 2:36:36, time: 0.919, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0357, loss_cls: 0.1381, acc: 94.6780, loss_bbox: 0.1865, loss_mask: 0.2072, loss: 0.5790 2024-05-29 06:11:51,454 - mmdet - INFO - Epoch [11][4650/7330] lr: 1.000e-05, eta: 2:35:49, time: 0.929, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0371, loss_cls: 0.1463, acc: 94.3481, loss_bbox: 0.1959, loss_mask: 0.2078, loss: 0.5988 2024-05-29 06:12:36,807 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 06:12:36,807 - mmdet - INFO - Epoch [11][4700/7330] lr: 1.000e-05, eta: 2:35:03, time: 0.907, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0379, loss_cls: 0.1444, acc: 94.4492, loss_bbox: 0.1951, loss_mask: 0.2045, loss: 0.5932 2024-05-29 06:13:22,995 - mmdet - INFO - Epoch [11][4750/7330] lr: 1.000e-05, eta: 2:34:16, time: 0.924, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0358, loss_cls: 0.1447, acc: 94.4463, loss_bbox: 0.1993, loss_mask: 0.2071, loss: 0.5985 2024-05-29 06:14:09,125 - mmdet - INFO - Epoch [11][4800/7330] lr: 1.000e-05, eta: 2:33:29, time: 0.923, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0363, loss_cls: 0.1461, acc: 94.3767, loss_bbox: 0.1987, loss_mask: 0.2085, loss: 0.6006 2024-05-29 06:14:57,041 - mmdet - INFO - Epoch [11][4850/7330] lr: 1.000e-05, eta: 2:32:42, time: 0.958, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0356, loss_cls: 0.1415, acc: 94.5876, loss_bbox: 0.1904, loss_mask: 0.2084, loss: 0.5871 2024-05-29 06:15:42,831 - mmdet - INFO - Epoch [11][4900/7330] lr: 1.000e-05, eta: 2:31:56, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0335, loss_cls: 0.1402, acc: 94.6440, loss_bbox: 0.1921, loss_mask: 0.2054, loss: 0.5815 2024-05-29 06:16:28,107 - mmdet - INFO - Epoch [11][4950/7330] lr: 1.000e-05, eta: 2:31:09, time: 0.906, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0351, loss_cls: 0.1399, acc: 94.5994, loss_bbox: 0.1851, loss_mask: 0.2068, loss: 0.5786 2024-05-29 06:17:14,210 - mmdet - INFO - Epoch [11][5000/7330] lr: 1.000e-05, eta: 2:30:22, time: 0.922, data_time: 
0.062, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0367, loss_cls: 0.1476, acc: 94.3779, loss_bbox: 0.1981, loss_mask: 0.2117, loss: 0.6059 2024-05-29 06:18:00,217 - mmdet - INFO - Epoch [11][5050/7330] lr: 1.000e-05, eta: 2:29:35, time: 0.921, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0373, loss_cls: 0.1479, acc: 94.3811, loss_bbox: 0.1966, loss_mask: 0.2078, loss: 0.6013 2024-05-29 06:18:45,991 - mmdet - INFO - Epoch [11][5100/7330] lr: 1.000e-05, eta: 2:28:48, time: 0.915, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0328, loss_cls: 0.1392, acc: 94.6665, loss_bbox: 0.1903, loss_mask: 0.2064, loss: 0.5798 2024-05-29 06:19:34,216 - mmdet - INFO - Epoch [11][5150/7330] lr: 1.000e-05, eta: 2:28:02, time: 0.964, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0357, loss_cls: 0.1435, acc: 94.6177, loss_bbox: 0.1933, loss_mask: 0.2068, loss: 0.5903 2024-05-29 06:20:20,013 - mmdet - INFO - Epoch [11][5200/7330] lr: 1.000e-05, eta: 2:27:15, time: 0.916, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0360, loss_cls: 0.1473, acc: 94.3411, loss_bbox: 0.1970, loss_mask: 0.2133, loss: 0.6052 2024-05-29 06:21:08,675 - mmdet - INFO - Epoch [11][5250/7330] lr: 1.000e-05, eta: 2:26:29, time: 0.973, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0385, loss_cls: 0.1505, acc: 94.2358, loss_bbox: 0.2040, loss_mask: 0.2114, loss: 0.6156 2024-05-29 06:21:54,718 - mmdet - INFO - Epoch [11][5300/7330] lr: 1.000e-05, eta: 2:25:42, time: 0.921, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0365, loss_cls: 0.1445, acc: 94.4668, loss_bbox: 0.1959, loss_mask: 0.2072, loss: 0.5953 2024-05-29 06:22:44,430 - mmdet - INFO - Epoch [11][5350/7330] lr: 1.000e-05, eta: 2:24:55, time: 0.994, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0342, loss_cls: 0.1401, acc: 94.5603, loss_bbox: 0.1943, loss_mask: 0.2137, loss: 0.5936 2024-05-29 06:23:30,402 - mmdet - INFO - Epoch [11][5400/7330] lr: 1.000e-05, eta: 2:24:09, time: 0.919, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0365, loss_cls: 0.1489, acc: 94.2827, loss_bbox: 0.1984, loss_mask: 0.2120, loss: 0.6077 2024-05-29 06:24:16,616 - mmdet - INFO - Epoch [11][5450/7330] lr: 1.000e-05, eta: 2:23:22, time: 0.924, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0371, loss_cls: 0.1459, acc: 94.3430, loss_bbox: 0.1994, loss_mask: 0.2131, loss: 0.6065 2024-05-29 06:25:07,750 - mmdet - INFO - Epoch [11][5500/7330] lr: 1.000e-05, eta: 2:22:36, time: 1.023, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0392, loss_cls: 0.1513, acc: 94.1392, loss_bbox: 0.2078, loss_mask: 0.2150, loss: 0.6242 2024-05-29 06:25:53,317 - mmdet - INFO - Epoch [11][5550/7330] lr: 1.000e-05, eta: 2:21:49, time: 0.911, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0367, loss_cls: 0.1492, acc: 94.2595, loss_bbox: 0.2020, loss_mask: 0.2147, loss: 0.6148 2024-05-29 06:26:41,401 - mmdet - INFO - Epoch [11][5600/7330] lr: 1.000e-05, eta: 2:21:02, time: 0.962, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0371, loss_cls: 0.1474, acc: 94.2502, loss_bbox: 0.2004, loss_mask: 0.2163, loss: 0.6125 2024-05-29 06:27:27,291 - mmdet - INFO - Epoch [11][5650/7330] lr: 1.000e-05, eta: 2:20:16, time: 0.918, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0357, loss_cls: 0.1391, acc: 
94.6956, loss_bbox: 0.1919, loss_mask: 0.2094, loss: 0.5870 2024-05-29 06:28:13,521 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 06:28:13,521 - mmdet - INFO - Epoch [11][5700/7330] lr: 1.000e-05, eta: 2:19:29, time: 0.925, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0383, loss_cls: 0.1491, acc: 94.3435, loss_bbox: 0.2029, loss_mask: 0.2139, loss: 0.6167 2024-05-29 06:28:59,550 - mmdet - INFO - Epoch [11][5750/7330] lr: 1.000e-05, eta: 2:18:42, time: 0.921, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0373, loss_cls: 0.1520, acc: 94.0974, loss_bbox: 0.2034, loss_mask: 0.2149, loss: 0.6195 2024-05-29 06:29:45,513 - mmdet - INFO - Epoch [11][5800/7330] lr: 1.000e-05, eta: 2:17:55, time: 0.919, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0378, loss_cls: 0.1523, acc: 94.1763, loss_bbox: 0.1990, loss_mask: 0.2086, loss: 0.6103 2024-05-29 06:30:31,075 - mmdet - INFO - Epoch [11][5850/7330] lr: 1.000e-05, eta: 2:17:08, time: 0.911, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0382, loss_cls: 0.1529, acc: 94.1299, loss_bbox: 0.2040, loss_mask: 0.2139, loss: 0.6215 2024-05-29 06:31:18,769 - mmdet - INFO - Epoch [11][5900/7330] lr: 1.000e-05, eta: 2:16:22, time: 0.954, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0361, loss_cls: 0.1426, acc: 94.5156, loss_bbox: 0.1903, loss_mask: 0.2043, loss: 0.5834 2024-05-29 06:32:04,781 - mmdet - INFO - Epoch [11][5950/7330] lr: 1.000e-05, eta: 2:15:35, time: 0.920, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0379, loss_cls: 0.1506, acc: 94.2158, loss_bbox: 0.1989, loss_mask: 0.2088, loss: 0.6089 2024-05-29 06:32:50,728 - mmdet - INFO - Epoch [11][6000/7330] lr: 1.000e-05, eta: 2:14:48, time: 0.919, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0376, loss_cls: 0.1486, acc: 94.2207, loss_bbox: 0.1998, loss_mask: 0.2134, loss: 0.6118 2024-05-29 06:33:36,545 - mmdet - INFO - Epoch [11][6050/7330] lr: 1.000e-05, eta: 2:14:01, time: 0.916, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0384, loss_cls: 0.1488, acc: 94.2722, loss_bbox: 0.2040, loss_mask: 0.2122, loss: 0.6160 2024-05-29 06:34:21,868 - mmdet - INFO - Epoch [11][6100/7330] lr: 1.000e-05, eta: 2:13:15, time: 0.906, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0360, loss_cls: 0.1405, acc: 94.5947, loss_bbox: 0.1907, loss_mask: 0.2088, loss: 0.5875 2024-05-29 06:35:07,977 - mmdet - INFO - Epoch [11][6150/7330] lr: 1.000e-05, eta: 2:12:28, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0371, loss_cls: 0.1501, acc: 94.2175, loss_bbox: 0.2025, loss_mask: 0.2135, loss: 0.6156 2024-05-29 06:35:56,752 - mmdet - INFO - Epoch [11][6200/7330] lr: 1.000e-05, eta: 2:11:41, time: 0.976, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0376, loss_cls: 0.1490, acc: 94.2000, loss_bbox: 0.2018, loss_mask: 0.2121, loss: 0.6127 2024-05-29 06:36:43,292 - mmdet - INFO - Epoch [11][6250/7330] lr: 1.000e-05, eta: 2:10:55, time: 0.931, data_time: 0.064, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0383, loss_cls: 0.1525, acc: 94.2222, loss_bbox: 0.2043, loss_mask: 0.2133, loss: 0.6209 2024-05-29 06:37:31,821 - mmdet - INFO - Epoch [11][6300/7330] lr: 1.000e-05, eta: 2:10:08, time: 0.971, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0370, loss_cls: 
0.1440, acc: 94.5010, loss_bbox: 0.1955, loss_mask: 0.2047, loss: 0.5936 2024-05-29 06:38:17,237 - mmdet - INFO - Epoch [11][6350/7330] lr: 1.000e-05, eta: 2:09:21, time: 0.908, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0368, loss_cls: 0.1434, acc: 94.4792, loss_bbox: 0.1929, loss_mask: 0.2090, loss: 0.5938 2024-05-29 06:39:08,598 - mmdet - INFO - Epoch [11][6400/7330] lr: 1.000e-05, eta: 2:08:35, time: 1.027, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0368, loss_cls: 0.1489, acc: 94.3191, loss_bbox: 0.1999, loss_mask: 0.2123, loss: 0.6100 2024-05-29 06:39:54,218 - mmdet - INFO - Epoch [11][6450/7330] lr: 1.000e-05, eta: 2:07:48, time: 0.912, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0344, loss_cls: 0.1396, acc: 94.7156, loss_bbox: 0.1881, loss_mask: 0.2019, loss: 0.5744 2024-05-29 06:40:40,586 - mmdet - INFO - Epoch [11][6500/7330] lr: 1.000e-05, eta: 2:07:02, time: 0.927, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0353, loss_cls: 0.1400, acc: 94.5508, loss_bbox: 0.1917, loss_mask: 0.2052, loss: 0.5826 2024-05-29 06:41:31,491 - mmdet - INFO - Epoch [11][6550/7330] lr: 1.000e-05, eta: 2:06:15, time: 1.018, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0409, loss_cls: 0.1558, acc: 93.9785, loss_bbox: 0.2097, loss_mask: 0.2178, loss: 0.6369 2024-05-29 06:42:16,550 - mmdet - INFO - Epoch [11][6600/7330] lr: 1.000e-05, eta: 2:05:28, time: 0.901, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0362, loss_cls: 0.1444, acc: 94.4558, loss_bbox: 0.1971, loss_mask: 0.2110, loss: 0.5999 2024-05-29 06:43:04,741 - mmdet - INFO - Epoch [11][6650/7330] lr: 1.000e-05, eta: 2:04:42, time: 0.964, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0361, loss_cls: 0.1439, acc: 94.5396, loss_bbox: 0.1910, loss_mask: 0.2094, loss: 0.5914 2024-05-29 06:43:50,538 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 06:43:50,538 - mmdet - INFO - Epoch [11][6700/7330] lr: 1.000e-05, eta: 2:03:55, time: 0.915, data_time: 0.061, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0360, loss_cls: 0.1443, acc: 94.4949, loss_bbox: 0.1923, loss_mask: 0.2118, loss: 0.5953 2024-05-29 06:44:36,066 - mmdet - INFO - Epoch [11][6750/7330] lr: 1.000e-05, eta: 2:03:08, time: 0.911, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0354, loss_cls: 0.1426, acc: 94.4963, loss_bbox: 0.1947, loss_mask: 0.2084, loss: 0.5920 2024-05-29 06:45:21,578 - mmdet - INFO - Epoch [11][6800/7330] lr: 1.000e-05, eta: 2:02:21, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0347, loss_cls: 0.1418, acc: 94.5579, loss_bbox: 0.1940, loss_mask: 0.2082, loss: 0.5893 2024-05-29 06:46:07,239 - mmdet - INFO - Epoch [11][6850/7330] lr: 1.000e-05, eta: 2:01:35, time: 0.913, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0358, loss_cls: 0.1483, acc: 94.3286, loss_bbox: 0.1956, loss_mask: 0.2061, loss: 0.5965 2024-05-29 06:46:53,285 - mmdet - INFO - Epoch [11][6900/7330] lr: 1.000e-05, eta: 2:00:48, time: 0.921, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0367, loss_cls: 0.1412, acc: 94.6284, loss_bbox: 0.1899, loss_mask: 0.2042, loss: 0.5833 2024-05-29 06:47:40,854 - mmdet - INFO - Epoch [11][6950/7330] lr: 1.000e-05, eta: 2:00:01, time: 0.951, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0367, 
loss_cls: 0.1492, acc: 94.2722, loss_bbox: 0.2006, loss_mask: 0.2173, loss: 0.6154
2024-05-29 06:48:26,298 - mmdet - INFO - Epoch [11][7000/7330] lr: 1.000e-05, eta: 1:59:14, time: 0.909, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0371, loss_cls: 0.1462, acc: 94.3552, loss_bbox: 0.2003, loss_mask: 0.2107, loss: 0.6065
2024-05-29 06:49:12,760 - mmdet - INFO - Epoch [11][7050/7330] lr: 1.000e-05, eta: 1:58:28, time: 0.929, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0395, loss_cls: 0.1508, acc: 94.1685, loss_bbox: 0.2017, loss_mask: 0.2120, loss: 0.6165
2024-05-29 06:49:58,455 - mmdet - INFO - Epoch [11][7100/7330] lr: 1.000e-05, eta: 1:57:41, time: 0.914, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0363, loss_cls: 0.1428, acc: 94.4932, loss_bbox: 0.1898, loss_mask: 0.2102, loss: 0.5912
2024-05-29 06:50:44,169 - mmdet - INFO - Epoch [11][7150/7330] lr: 1.000e-05, eta: 1:56:54, time: 0.914, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0345, loss_cls: 0.1435, acc: 94.4719, loss_bbox: 0.1938, loss_mask: 0.2102, loss: 0.5922
2024-05-29 06:51:32,660 - mmdet - INFO - Epoch [11][7200/7330] lr: 1.000e-05, eta: 1:56:08, time: 0.970, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0342, loss_cls: 0.1377, acc: 94.7515, loss_bbox: 0.1877, loss_mask: 0.2060, loss: 0.5769
2024-05-29 06:52:18,388 - mmdet - INFO - Epoch [11][7250/7330] lr: 1.000e-05, eta: 1:55:21, time: 0.914, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0381, loss_cls: 0.1511, acc: 94.1187, loss_bbox: 0.2043, loss_mask: 0.2133, loss: 0.6190
2024-05-29 06:53:06,413 - mmdet - INFO - Epoch [11][7300/7330] lr: 1.000e-05, eta: 1:54:34, time: 0.961, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0368, loss_cls: 0.1410, acc: 94.5930, loss_bbox: 0.1885, loss_mask: 0.2063, loss: 0.5836
2024-05-29 06:53:34,658 - mmdet - INFO - Saving checkpoint at 11 epochs
2024-05-29 06:55:44,788 - mmdet - INFO - Evaluating bbox...
2024-05-29 06:56:09,425 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.498
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.725
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.545
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.314
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.539
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.670
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.610
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.610
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.610
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.421
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.651
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.774
2024-05-29 06:56:09,425 - mmdet - INFO - Evaluating segm...
2024-05-29 06:56:31,591 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.443
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.690
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.475
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.228
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.478
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.348
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.733
2024-05-29 06:56:31,935 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 06:56:31,937 - mmdet - INFO - Epoch(val) [11][625] bbox_mAP: 0.4980, bbox_mAP_50: 0.7250, bbox_mAP_75: 0.5450, bbox_mAP_s: 0.3140, bbox_mAP_m: 0.5390, bbox_mAP_l: 0.6700, bbox_mAP_copypaste: 0.498 0.725 0.545 0.314 0.539 0.670, segm_mAP: 0.4430, segm_mAP_50: 0.6900, segm_mAP_75: 0.4750, segm_mAP_s: 0.2280, segm_mAP_m: 0.4780, segm_mAP_l: 0.6590, segm_mAP_copypaste: 0.443 0.690 0.475 0.228 0.478 0.659
2024-05-29 06:57:29,708 - mmdet - INFO - Epoch [12][50/7330] lr: 1.000e-06, eta: 1:53:18, time: 1.155, data_time: 0.117, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0380, loss_cls: 0.1460, acc: 94.4177, loss_bbox: 0.1998, loss_mask: 0.2100, loss: 0.6055
2024-05-29 06:58:15,526 - mmdet - INFO - Epoch [12][100/7330] lr: 1.000e-06, eta: 1:52:31, time: 0.916, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0379, loss_cls: 0.1431, acc: 94.4995, loss_bbox: 0.1980, loss_mask: 0.2140, loss: 0.6045
2024-05-29 06:59:01,318 - mmdet - INFO - Epoch [12][150/7330] lr: 1.000e-06, eta: 1:51:44, time: 0.916, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0357, loss_cls: 0.1411, acc: 94.6177, loss_bbox: 0.1968, loss_mask: 0.2097, loss: 0.5936
2024-05-29 06:59:50,046 - mmdet - INFO - Epoch [12][200/7330] lr: 1.000e-06, eta: 1:50:58, time: 0.975, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0382, loss_cls: 0.1470, acc: 94.3621, loss_bbox: 0.2006, loss_mask: 0.2086, loss: 0.6061
2024-05-29 07:00:38,723 - mmdet - INFO - Epoch [12][250/7330] lr: 1.000e-06, eta: 1:50:11, time: 0.973, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0382, loss_cls: 0.1460, acc: 94.3787, loss_bbox: 0.1986, loss_mask: 0.2120, loss: 0.6062
2024-05-29 07:01:24,694 - mmdet - INFO - Epoch [12][300/7330] lr: 1.000e-06, eta: 1:49:25, time: 0.919, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0358, loss_cls: 0.1458, acc: 94.3682, loss_bbox: 0.1975, loss_mask: 0.2121, loss: 0.6018
2024-05-29 07:02:10,286 - mmdet - INFO - Epoch [12][350/7330] lr: 1.000e-06, eta: 1:48:38, time: 0.912, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1461, acc: 94.4060, loss_bbox: 0.1931, loss_mask: 0.2100, loss: 0.5980
2024-05-29 07:02:56,183 - mmdet - INFO - Epoch [12][400/7330] lr: 1.000e-06, eta: 1:47:51, time: 0.918, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0373, loss_cls: 0.1443, acc: 94.4595, loss_bbox: 0.1964,
loss_mask: 0.2099, loss: 0.5993 2024-05-29 07:03:42,798 - mmdet - INFO - Epoch [12][450/7330] lr: 1.000e-06, eta: 1:47:04, time: 0.932, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0361, loss_cls: 0.1448, acc: 94.3613, loss_bbox: 0.2021, loss_mask: 0.2107, loss: 0.6056 2024-05-29 07:04:28,773 - mmdet - INFO - Epoch [12][500/7330] lr: 1.000e-06, eta: 1:46:18, time: 0.919, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0358, loss_cls: 0.1456, acc: 94.4272, loss_bbox: 0.1996, loss_mask: 0.2101, loss: 0.6018 2024-05-29 07:05:15,103 - mmdet - INFO - Epoch [12][550/7330] lr: 1.000e-06, eta: 1:45:31, time: 0.927, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0367, loss_cls: 0.1461, acc: 94.3218, loss_bbox: 0.1953, loss_mask: 0.2058, loss: 0.5950 2024-05-29 07:06:02,727 - mmdet - INFO - Epoch [12][600/7330] lr: 1.000e-06, eta: 1:44:44, time: 0.952, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0356, loss_cls: 0.1362, acc: 94.7815, loss_bbox: 0.1842, loss_mask: 0.2063, loss: 0.5724 2024-05-29 07:06:50,833 - mmdet - INFO - Epoch [12][650/7330] lr: 1.000e-06, eta: 1:43:58, time: 0.962, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0380, loss_cls: 0.1530, acc: 94.0967, loss_bbox: 0.2085, loss_mask: 0.2165, loss: 0.6277 2024-05-29 07:07:36,598 - mmdet - INFO - Epoch [12][700/7330] lr: 1.000e-06, eta: 1:43:11, time: 0.915, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0358, loss_cls: 0.1446, acc: 94.4668, loss_bbox: 0.1984, loss_mask: 0.2090, loss: 0.6000 2024-05-29 07:08:23,099 - mmdet - INFO - Epoch [12][750/7330] lr: 1.000e-06, eta: 1:42:24, time: 0.930, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0367, loss_cls: 0.1509, acc: 94.1492, loss_bbox: 0.2051, loss_mask: 0.2141, loss: 0.6189 2024-05-29 07:09:09,254 - mmdet - INFO - Epoch [12][800/7330] lr: 1.000e-06, eta: 1:41:37, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0349, loss_cls: 0.1435, acc: 94.5144, loss_bbox: 0.1919, loss_mask: 0.2114, loss: 0.5927 2024-05-29 07:09:57,929 - mmdet - INFO - Epoch [12][850/7330] lr: 1.000e-06, eta: 1:40:51, time: 0.973, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0374, loss_cls: 0.1456, acc: 94.3779, loss_bbox: 0.1948, loss_mask: 0.2088, loss: 0.5981 2024-05-29 07:10:44,296 - mmdet - INFO - Epoch [12][900/7330] lr: 1.000e-06, eta: 1:40:04, time: 0.927, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0390, loss_cls: 0.1498, acc: 94.2549, loss_bbox: 0.2032, loss_mask: 0.2127, loss: 0.6160 2024-05-29 07:11:30,215 - mmdet - INFO - Epoch [12][950/7330] lr: 1.000e-06, eta: 1:39:17, time: 0.918, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0365, loss_cls: 0.1453, acc: 94.4438, loss_bbox: 0.1937, loss_mask: 0.2098, loss: 0.5961 2024-05-29 07:12:16,128 - mmdet - INFO - Epoch [12][1000/7330] lr: 1.000e-06, eta: 1:38:31, time: 0.918, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0348, loss_cls: 0.1428, acc: 94.5747, loss_bbox: 0.1939, loss_mask: 0.2051, loss: 0.5874 2024-05-29 07:13:04,518 - mmdet - INFO - Epoch [12][1050/7330] lr: 1.000e-06, eta: 1:37:44, time: 0.968, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0327, loss_cls: 0.1375, acc: 94.6843, loss_bbox: 0.1875, loss_mask: 0.2061, loss: 0.5746 2024-05-29 07:13:52,441 - mmdet - INFO - Epoch [12][1100/7330] lr: 
1.000e-06, eta: 1:36:58, time: 0.958, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0342, loss_cls: 0.1399, acc: 94.6243, loss_bbox: 0.1910, loss_mask: 0.2065, loss: 0.5811 2024-05-29 07:14:38,624 - mmdet - INFO - Epoch [12][1150/7330] lr: 1.000e-06, eta: 1:36:11, time: 0.924, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0364, loss_cls: 0.1414, acc: 94.5850, loss_bbox: 0.1917, loss_mask: 0.2079, loss: 0.5889 2024-05-29 07:15:24,977 - mmdet - INFO - Epoch [12][1200/7330] lr: 1.000e-06, eta: 1:35:24, time: 0.927, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0346, loss_cls: 0.1414, acc: 94.5515, loss_bbox: 0.1954, loss_mask: 0.2103, loss: 0.5927 2024-05-29 07:16:13,537 - mmdet - INFO - Epoch [12][1250/7330] lr: 1.000e-06, eta: 1:34:38, time: 0.971, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0386, loss_cls: 0.1435, acc: 94.4460, loss_bbox: 0.1997, loss_mask: 0.2087, loss: 0.6028 2024-05-29 07:17:02,569 - mmdet - INFO - Epoch [12][1300/7330] lr: 1.000e-06, eta: 1:33:51, time: 0.981, data_time: 0.054, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0377, loss_cls: 0.1452, acc: 94.4182, loss_bbox: 0.1971, loss_mask: 0.2106, loss: 0.6014 2024-05-29 07:17:48,626 - mmdet - INFO - Epoch [12][1350/7330] lr: 1.000e-06, eta: 1:33:04, time: 0.921, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0362, loss_cls: 0.1408, acc: 94.4888, loss_bbox: 0.1970, loss_mask: 0.2094, loss: 0.5941 2024-05-29 07:18:37,579 - mmdet - INFO - Epoch [12][1400/7330] lr: 1.000e-06, eta: 1:32:18, time: 0.979, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0361, loss_cls: 0.1421, acc: 94.6025, loss_bbox: 0.1902, loss_mask: 0.2045, loss: 0.5843 2024-05-29 07:19:24,012 - mmdet - INFO - Epoch [12][1450/7330] lr: 1.000e-06, eta: 1:31:31, time: 0.929, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0382, loss_cls: 0.1474, acc: 94.2561, loss_bbox: 0.1986, loss_mask: 0.2079, loss: 0.6033 2024-05-29 07:20:10,911 - mmdet - INFO - Epoch [12][1500/7330] lr: 1.000e-06, eta: 1:30:44, time: 0.938, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0369, loss_cls: 0.1470, acc: 94.4011, loss_bbox: 0.1978, loss_mask: 0.2119, loss: 0.6055 2024-05-29 07:20:56,676 - mmdet - INFO - Epoch [12][1550/7330] lr: 1.000e-06, eta: 1:29:58, time: 0.915, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0359, loss_cls: 0.1399, acc: 94.5310, loss_bbox: 0.1950, loss_mask: 0.2102, loss: 0.5918 2024-05-29 07:21:42,455 - mmdet - INFO - Epoch [12][1600/7330] lr: 1.000e-06, eta: 1:29:11, time: 0.916, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0346, loss_cls: 0.1405, acc: 94.5818, loss_bbox: 0.1932, loss_mask: 0.2084, loss: 0.5871 2024-05-29 07:22:30,406 - mmdet - INFO - Epoch [12][1650/7330] lr: 1.000e-06, eta: 1:28:24, time: 0.959, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0345, loss_cls: 0.1382, acc: 94.6899, loss_bbox: 0.1841, loss_mask: 0.2013, loss: 0.5681 2024-05-29 07:23:18,640 - mmdet - INFO - Epoch [12][1700/7330] lr: 1.000e-06, eta: 1:27:38, time: 0.965, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0366, loss_cls: 0.1403, acc: 94.5410, loss_bbox: 0.1962, loss_mask: 0.2044, loss: 0.5891 2024-05-29 07:24:04,647 - mmdet - INFO - Epoch [12][1750/7330] lr: 1.000e-06, eta: 1:26:51, time: 0.920, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0111, 
loss_rpn_bbox: 0.0369, loss_cls: 0.1448, acc: 94.4092, loss_bbox: 0.1932, loss_mask: 0.2046, loss: 0.5906 2024-05-29 07:24:51,454 - mmdet - INFO - Epoch [12][1800/7330] lr: 1.000e-06, eta: 1:26:04, time: 0.936, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0360, loss_cls: 0.1379, acc: 94.7070, loss_bbox: 0.1889, loss_mask: 0.2087, loss: 0.5833 2024-05-29 07:25:37,259 - mmdet - INFO - Epoch [12][1850/7330] lr: 1.000e-06, eta: 1:25:17, time: 0.916, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0367, loss_cls: 0.1423, acc: 94.5505, loss_bbox: 0.1906, loss_mask: 0.2059, loss: 0.5873 2024-05-29 07:26:25,831 - mmdet - INFO - Epoch [12][1900/7330] lr: 1.000e-06, eta: 1:24:31, time: 0.971, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0388, loss_cls: 0.1468, acc: 94.3076, loss_bbox: 0.1979, loss_mask: 0.2119, loss: 0.6072 2024-05-29 07:27:12,612 - mmdet - INFO - Epoch [12][1950/7330] lr: 1.000e-06, eta: 1:23:44, time: 0.936, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0391, loss_cls: 0.1466, acc: 94.3438, loss_bbox: 0.1959, loss_mask: 0.2095, loss: 0.6045 2024-05-29 07:27:59,217 - mmdet - INFO - Epoch [12][2000/7330] lr: 1.000e-06, eta: 1:22:57, time: 0.932, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0364, loss_cls: 0.1355, acc: 94.7480, loss_bbox: 0.1851, loss_mask: 0.2049, loss: 0.5718 2024-05-29 07:28:45,533 - mmdet - INFO - Epoch [12][2050/7330] lr: 1.000e-06, eta: 1:22:11, time: 0.926, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0336, loss_cls: 0.1402, acc: 94.6089, loss_bbox: 0.1885, loss_mask: 0.2012, loss: 0.5742 2024-05-29 07:29:33,977 - mmdet - INFO - Epoch [12][2100/7330] lr: 1.000e-06, eta: 1:21:24, time: 0.969, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0371, loss_cls: 0.1476, acc: 94.2947, loss_bbox: 0.2033, loss_mask: 0.2098, loss: 0.6092 2024-05-29 07:30:22,231 - mmdet - INFO - Epoch [12][2150/7330] lr: 1.000e-06, eta: 1:20:38, time: 0.965, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0349, loss_cls: 0.1426, acc: 94.5217, loss_bbox: 0.1928, loss_mask: 0.2049, loss: 0.5855 2024-05-29 07:31:08,386 - mmdet - INFO - Epoch [12][2200/7330] lr: 1.000e-06, eta: 1:19:51, time: 0.923, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0362, loss_cls: 0.1421, acc: 94.5034, loss_bbox: 0.1948, loss_mask: 0.2082, loss: 0.5928 2024-05-29 07:31:57,188 - mmdet - INFO - Epoch [12][2250/7330] lr: 1.000e-06, eta: 1:19:04, time: 0.976, data_time: 0.060, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0367, loss_cls: 0.1399, acc: 94.5964, loss_bbox: 0.1901, loss_mask: 0.2082, loss: 0.5862 2024-05-29 07:32:47,095 - mmdet - INFO - Epoch [12][2300/7330] lr: 1.000e-06, eta: 1:18:18, time: 0.998, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0373, loss_cls: 0.1413, acc: 94.5317, loss_bbox: 0.1925, loss_mask: 0.2065, loss: 0.5879 2024-05-29 07:33:34,196 - mmdet - INFO - Epoch [12][2350/7330] lr: 1.000e-06, eta: 1:17:31, time: 0.942, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0371, loss_cls: 0.1460, acc: 94.3884, loss_bbox: 0.1987, loss_mask: 0.2127, loss: 0.6057 2024-05-29 07:34:19,764 - mmdet - INFO - Epoch [12][2400/7330] lr: 1.000e-06, eta: 1:16:44, time: 0.911, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0356, loss_cls: 0.1417, acc: 94.4944, loss_bbox: 0.1947, loss_mask: 0.2049, 
loss: 0.5865 2024-05-29 07:35:08,813 - mmdet - INFO - Epoch [12][2450/7330] lr: 1.000e-06, eta: 1:15:58, time: 0.981, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0348, loss_cls: 0.1385, acc: 94.7837, loss_bbox: 0.1827, loss_mask: 0.2038, loss: 0.5704 2024-05-29 07:35:55,211 - mmdet - INFO - Epoch [12][2500/7330] lr: 1.000e-06, eta: 1:15:11, time: 0.928, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0386, loss_cls: 0.1480, acc: 94.3303, loss_bbox: 0.1993, loss_mask: 0.2108, loss: 0.6086 2024-05-29 07:36:41,450 - mmdet - INFO - Epoch [12][2550/7330] lr: 1.000e-06, eta: 1:14:24, time: 0.925, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0358, loss_cls: 0.1414, acc: 94.4692, loss_bbox: 0.1958, loss_mask: 0.2101, loss: 0.5943 2024-05-29 07:37:27,435 - mmdet - INFO - Epoch [12][2600/7330] lr: 1.000e-06, eta: 1:13:38, time: 0.920, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0368, loss_cls: 0.1473, acc: 94.2876, loss_bbox: 0.2029, loss_mask: 0.2158, loss: 0.6138 2024-05-29 07:38:13,960 - mmdet - INFO - Epoch [12][2650/7330] lr: 1.000e-06, eta: 1:12:51, time: 0.930, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0394, loss_cls: 0.1542, acc: 94.1050, loss_bbox: 0.2046, loss_mask: 0.2123, loss: 0.6221 2024-05-29 07:39:02,739 - mmdet - INFO - Epoch [12][2700/7330] lr: 1.000e-06, eta: 1:12:04, time: 0.976, data_time: 0.057, memory: 20925, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0349, loss_cls: 0.1388, acc: 94.6157, loss_bbox: 0.1940, loss_mask: 0.2036, loss: 0.5815 2024-05-29 07:39:51,218 - mmdet - INFO - Epoch [12][2750/7330] lr: 1.000e-06, eta: 1:11:18, time: 0.970, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0366, loss_cls: 0.1440, acc: 94.4951, loss_bbox: 0.1996, loss_mask: 0.2055, loss: 0.5968 2024-05-29 07:40:37,893 - mmdet - INFO - Epoch [12][2800/7330] lr: 1.000e-06, eta: 1:10:31, time: 0.934, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0352, loss_cls: 0.1396, acc: 94.6343, loss_bbox: 0.1915, loss_mask: 0.2080, loss: 0.5858 2024-05-29 07:41:23,796 - mmdet - INFO - Epoch [12][2850/7330] lr: 1.000e-06, eta: 1:09:44, time: 0.918, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0381, loss_cls: 0.1456, acc: 94.4060, loss_bbox: 0.1924, loss_mask: 0.2104, loss: 0.5978 2024-05-29 07:42:10,284 - mmdet - INFO - Epoch [12][2900/7330] lr: 1.000e-06, eta: 1:08:57, time: 0.930, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0356, loss_cls: 0.1464, acc: 94.3914, loss_bbox: 0.1959, loss_mask: 0.2058, loss: 0.5945 2024-05-29 07:42:58,849 - mmdet - INFO - Epoch [12][2950/7330] lr: 1.000e-06, eta: 1:08:11, time: 0.971, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0344, loss_cls: 0.1374, acc: 94.6875, loss_bbox: 0.1870, loss_mask: 0.2012, loss: 0.5705 2024-05-29 07:43:44,480 - mmdet - INFO - Epoch [12][3000/7330] lr: 1.000e-06, eta: 1:07:24, time: 0.913, data_time: 0.034, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0328, loss_cls: 0.1354, acc: 94.7673, loss_bbox: 0.1852, loss_mask: 0.2024, loss: 0.5662 2024-05-29 07:44:30,147 - mmdet - INFO - Epoch [12][3050/7330] lr: 1.000e-06, eta: 1:06:37, time: 0.913, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0356, loss_cls: 0.1390, acc: 94.5784, loss_bbox: 0.1894, loss_mask: 0.2091, loss: 0.5843 2024-05-29 07:45:16,582 - mmdet - INFO - Epoch [12][3100/7330] lr: 1.000e-06, eta: 
1:05:51, time: 0.929, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0356, loss_cls: 0.1418, acc: 94.5732, loss_bbox: 0.1939, loss_mask: 0.2059, loss: 0.5878 2024-05-29 07:46:05,634 - mmdet - INFO - Epoch [12][3150/7330] lr: 1.000e-06, eta: 1:05:04, time: 0.981, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0372, loss_cls: 0.1383, acc: 94.7356, loss_bbox: 0.1878, loss_mask: 0.2047, loss: 0.5781 2024-05-29 07:46:53,339 - mmdet - INFO - Epoch [12][3200/7330] lr: 1.000e-06, eta: 1:04:17, time: 0.954, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0337, loss_cls: 0.1358, acc: 94.7607, loss_bbox: 0.1862, loss_mask: 0.2059, loss: 0.5720 2024-05-29 07:47:40,074 - mmdet - INFO - Epoch [12][3250/7330] lr: 1.000e-06, eta: 1:03:31, time: 0.935, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0386, loss_cls: 0.1508, acc: 94.2061, loss_bbox: 0.2052, loss_mask: 0.2113, loss: 0.6173 2024-05-29 07:48:28,596 - mmdet - INFO - Epoch [12][3300/7330] lr: 1.000e-06, eta: 1:02:44, time: 0.971, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0368, loss_cls: 0.1483, acc: 94.2551, loss_bbox: 0.2037, loss_mask: 0.2111, loss: 0.6106 2024-05-29 07:49:17,132 - mmdet - INFO - Epoch [12][3350/7330] lr: 1.000e-06, eta: 1:01:57, time: 0.971, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0361, loss_cls: 0.1447, acc: 94.4250, loss_bbox: 0.1962, loss_mask: 0.2082, loss: 0.5956 2024-05-29 07:50:02,946 - mmdet - INFO - Epoch [12][3400/7330] lr: 1.000e-06, eta: 1:01:11, time: 0.916, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0345, loss_cls: 0.1416, acc: 94.5974, loss_bbox: 0.1903, loss_mask: 0.2097, loss: 0.5867 2024-05-29 07:50:49,066 - mmdet - INFO - Epoch [12][3450/7330] lr: 1.000e-06, eta: 1:00:24, time: 0.922, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0372, loss_cls: 0.1484, acc: 94.2639, loss_bbox: 0.1954, loss_mask: 0.2104, loss: 0.6031 2024-05-29 07:51:37,741 - mmdet - INFO - Epoch [12][3500/7330] lr: 1.000e-06, eta: 0:59:37, time: 0.974, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0347, loss_cls: 0.1420, acc: 94.5295, loss_bbox: 0.1883, loss_mask: 0.2042, loss: 0.5804 2024-05-29 07:52:23,727 - mmdet - INFO - Epoch [12][3550/7330] lr: 1.000e-06, eta: 0:58:51, time: 0.920, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0375, loss_cls: 0.1428, acc: 94.5022, loss_bbox: 0.1942, loss_mask: 0.2123, loss: 0.5983 2024-05-29 07:53:09,417 - mmdet - INFO - Epoch [12][3600/7330] lr: 1.000e-06, eta: 0:58:04, time: 0.914, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0353, loss_cls: 0.1388, acc: 94.7632, loss_bbox: 0.1919, loss_mask: 0.2015, loss: 0.5778 2024-05-29 07:53:55,298 - mmdet - INFO - Epoch [12][3650/7330] lr: 1.000e-06, eta: 0:57:17, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0366, loss_cls: 0.1471, acc: 94.3828, loss_bbox: 0.1997, loss_mask: 0.2079, loss: 0.6029 2024-05-29 07:54:41,226 - mmdet - INFO - Epoch [12][3700/7330] lr: 1.000e-06, eta: 0:56:30, time: 0.918, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0374, loss_cls: 0.1519, acc: 94.1794, loss_bbox: 0.2043, loss_mask: 0.2090, loss: 0.6143 2024-05-29 07:55:32,050 - mmdet - INFO - Epoch [12][3750/7330] lr: 1.000e-06, eta: 0:55:44, time: 1.017, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0100, loss_rpn_bbox: 
0.0362, loss_cls: 0.1358, acc: 94.7974, loss_bbox: 0.1824, loss_mask: 0.2058, loss: 0.5702 2024-05-29 07:56:17,716 - mmdet - INFO - Epoch [12][3800/7330] lr: 1.000e-06, eta: 0:54:57, time: 0.913, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0357, loss_cls: 0.1430, acc: 94.4893, loss_bbox: 0.1948, loss_mask: 0.2069, loss: 0.5918 2024-05-29 07:57:04,986 - mmdet - INFO - Epoch [12][3850/7330] lr: 1.000e-06, eta: 0:54:10, time: 0.945, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0383, loss_cls: 0.1425, acc: 94.5310, loss_bbox: 0.1934, loss_mask: 0.2103, loss: 0.5966 2024-05-29 07:57:51,858 - mmdet - INFO - Epoch [12][3900/7330] lr: 1.000e-06, eta: 0:53:24, time: 0.937, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0355, loss_cls: 0.1404, acc: 94.6162, loss_bbox: 0.1874, loss_mask: 0.2014, loss: 0.5757 2024-05-29 07:58:40,881 - mmdet - INFO - Epoch [12][3950/7330] lr: 1.000e-06, eta: 0:52:37, time: 0.981, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0361, loss_cls: 0.1425, acc: 94.4817, loss_bbox: 0.1955, loss_mask: 0.2098, loss: 0.5958 2024-05-29 07:59:27,193 - mmdet - INFO - Epoch [12][4000/7330] lr: 1.000e-06, eta: 0:51:50, time: 0.926, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0387, loss_cls: 0.1492, acc: 94.2695, loss_bbox: 0.1998, loss_mask: 0.2114, loss: 0.6106 2024-05-29 08:00:13,583 - mmdet - INFO - Epoch [12][4050/7330] lr: 1.000e-06, eta: 0:51:04, time: 0.928, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0399, loss_cls: 0.1466, acc: 94.3960, loss_bbox: 0.1965, loss_mask: 0.2079, loss: 0.6027 2024-05-29 08:00:59,484 - mmdet - INFO - Epoch [12][4100/7330] lr: 1.000e-06, eta: 0:50:17, time: 0.918, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0382, loss_cls: 0.1443, acc: 94.4631, loss_bbox: 0.1977, loss_mask: 0.2108, loss: 0.6019 2024-05-29 08:01:47,864 - mmdet - INFO - Epoch [12][4150/7330] lr: 1.000e-06, eta: 0:49:30, time: 0.968, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0363, loss_cls: 0.1417, acc: 94.5000, loss_bbox: 0.1940, loss_mask: 0.2094, loss: 0.5926 2024-05-29 08:02:33,852 - mmdet - INFO - Epoch [12][4200/7330] lr: 1.000e-06, eta: 0:48:43, time: 0.920, data_time: 0.058, memory: 20925, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0378, loss_cls: 0.1445, acc: 94.3547, loss_bbox: 0.1971, loss_mask: 0.2129, loss: 0.6050 2024-05-29 08:03:22,086 - mmdet - INFO - Epoch [12][4250/7330] lr: 1.000e-06, eta: 0:47:57, time: 0.965, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0369, loss_cls: 0.1434, acc: 94.4517, loss_bbox: 0.1946, loss_mask: 0.2091, loss: 0.5949 2024-05-29 08:04:07,391 - mmdet - INFO - Epoch [12][4300/7330] lr: 1.000e-06, eta: 0:47:10, time: 0.906, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0348, loss_cls: 0.1387, acc: 94.6816, loss_bbox: 0.1916, loss_mask: 0.2064, loss: 0.5821 2024-05-29 08:04:55,352 - mmdet - INFO - Epoch [12][4350/7330] lr: 1.000e-06, eta: 0:46:23, time: 0.959, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0352, loss_cls: 0.1349, acc: 94.7769, loss_bbox: 0.1829, loss_mask: 0.2012, loss: 0.5652 2024-05-29 08:05:44,110 - mmdet - INFO - Epoch [12][4400/7330] lr: 1.000e-06, eta: 0:45:37, time: 0.975, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0386, loss_cls: 0.1486, acc: 94.2529, loss_bbox: 0.2004, loss_mask: 0.2109, loss: 0.6102 
2024-05-29 08:06:30,210 - mmdet - INFO - Epoch [12][4450/7330] lr: 1.000e-06, eta: 0:44:50, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0371, loss_cls: 0.1488, acc: 94.2444, loss_bbox: 0.1993, loss_mask: 0.2088, loss: 0.6054 2024-05-29 08:07:18,239 - mmdet - INFO - Epoch [12][4500/7330] lr: 1.000e-06, eta: 0:44:03, time: 0.961, data_time: 0.065, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0364, loss_cls: 0.1417, acc: 94.6038, loss_bbox: 0.1899, loss_mask: 0.2063, loss: 0.5859 2024-05-29 08:08:04,234 - mmdet - INFO - Epoch [12][4550/7330] lr: 1.000e-06, eta: 0:43:17, time: 0.920, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0385, loss_cls: 0.1537, acc: 93.9724, loss_bbox: 0.2113, loss_mask: 0.2161, loss: 0.6314 2024-05-29 08:08:50,264 - mmdet - INFO - Epoch [12][4600/7330] lr: 1.000e-06, eta: 0:42:30, time: 0.921, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0355, loss_cls: 0.1410, acc: 94.6194, loss_bbox: 0.1917, loss_mask: 0.2099, loss: 0.5879 2024-05-29 08:09:36,398 - mmdet - INFO - Epoch [12][4650/7330] lr: 1.000e-06, eta: 0:41:43, time: 0.922, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0352, loss_cls: 0.1426, acc: 94.5291, loss_bbox: 0.1911, loss_mask: 0.2100, loss: 0.5904 2024-05-29 08:10:21,879 - mmdet - INFO - Epoch [12][4700/7330] lr: 1.000e-06, eta: 0:40:56, time: 0.910, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0366, loss_cls: 0.1456, acc: 94.2988, loss_bbox: 0.1952, loss_mask: 0.2070, loss: 0.5963 2024-05-29 08:11:08,011 - mmdet - INFO - Epoch [12][4750/7330] lr: 1.000e-06, eta: 0:40:10, time: 0.923, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0388, loss_cls: 0.1538, acc: 94.0884, loss_bbox: 0.2068, loss_mask: 0.2104, loss: 0.6222 2024-05-29 08:11:58,191 - mmdet - INFO - Epoch [12][4800/7330] lr: 1.000e-06, eta: 0:39:23, time: 1.004, data_time: 0.053, memory: 20925, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0336, loss_cls: 0.1384, acc: 94.5911, loss_bbox: 0.1876, loss_mask: 0.2044, loss: 0.5738 2024-05-29 08:12:44,275 - mmdet - INFO - Epoch [12][4850/7330] lr: 1.000e-06, eta: 0:38:36, time: 0.922, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0361, loss_cls: 0.1385, acc: 94.6787, loss_bbox: 0.1936, loss_mask: 0.2091, loss: 0.5867 2024-05-29 08:13:30,644 - mmdet - INFO - Epoch [12][4900/7330] lr: 1.000e-06, eta: 0:37:50, time: 0.927, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0355, loss_cls: 0.1419, acc: 94.5107, loss_bbox: 0.1908, loss_mask: 0.2081, loss: 0.5882 2024-05-29 08:14:16,371 - mmdet - INFO - Epoch [12][4950/7330] lr: 1.000e-06, eta: 0:37:03, time: 0.915, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0352, loss_cls: 0.1413, acc: 94.6304, loss_bbox: 0.1883, loss_mask: 0.2093, loss: 0.5851 2024-05-29 08:15:04,524 - mmdet - INFO - Epoch [12][5000/7330] lr: 1.000e-06, eta: 0:36:16, time: 0.963, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0361, loss_cls: 0.1384, acc: 94.6282, loss_bbox: 0.1916, loss_mask: 0.2071, loss: 0.5827 2024-05-29 08:15:50,023 - mmdet - INFO - Epoch [12][5050/7330] lr: 1.000e-06, eta: 0:35:29, time: 0.910, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0352, loss_cls: 0.1393, acc: 94.6362, loss_bbox: 0.1900, loss_mask: 0.2025, loss: 0.5785 2024-05-29 08:16:36,115 - mmdet - INFO - Epoch [12][5100/7330] lr: 1.000e-06, eta: 0:34:43, 
time: 0.922, data_time: 0.049, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0358, loss_cls: 0.1438, acc: 94.5146, loss_bbox: 0.1975, loss_mask: 0.2032, loss: 0.5910
2024-05-29 08:17:22,629 - mmdet - INFO - Epoch [12][5150/7330] lr: 1.000e-06, eta: 0:33:56, time: 0.930, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0389, loss_cls: 0.1437, acc: 94.4739, loss_bbox: 0.1962, loss_mask: 0.2070, loss: 0.5979
2024-05-29 08:18:11,607 - mmdet - INFO - Epoch [12][5200/7330] lr: 1.000e-06, eta: 0:33:09, time: 0.979, data_time: 0.056, memory: 20925, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0370, loss_cls: 0.1432, acc: 94.5105, loss_bbox: 0.1950, loss_mask: 0.2100, loss: 0.5966
2024-05-29 08:18:57,693 - mmdet - INFO - Epoch [12][5250/7330] lr: 1.000e-06, eta: 0:32:23, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0375, loss_cls: 0.1452, acc: 94.4131, loss_bbox: 0.1956, loss_mask: 0.2065, loss: 0.5957
2024-05-29 08:19:46,305 - mmdet - INFO - Epoch [12][5300/7330] lr: 1.000e-06, eta: 0:31:36, time: 0.972, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0392, loss_cls: 0.1475, acc: 94.3196, loss_bbox: 0.2001, loss_mask: 0.2123, loss: 0.6116
2024-05-29 08:20:32,417 - mmdet - INFO - Epoch [12][5350/7330] lr: 1.000e-06, eta: 0:30:49, time: 0.922, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0358, loss_cls: 0.1479, acc: 94.2583, loss_bbox: 0.1991, loss_mask: 0.2096, loss: 0.6027
2024-05-29 08:21:19,621 - mmdet - INFO - Epoch [12][5400/7330] lr: 1.000e-06, eta: 0:30:03, time: 0.944, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0351, loss_cls: 0.1350, acc: 94.7888, loss_bbox: 0.1904, loss_mask: 0.2032, loss: 0.5737
2024-05-29 08:22:07,872 - mmdet - INFO - Epoch [12][5450/7330] lr: 1.000e-06, eta: 0:29:16, time: 0.965, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0371, loss_cls: 0.1482, acc: 94.3948, loss_bbox: 0.2004, loss_mask: 0.2164, loss: 0.6139
2024-05-29 08:22:53,645 - mmdet - INFO - Epoch [12][5500/7330] lr: 1.000e-06, eta: 0:28:29, time: 0.915, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0384, loss_cls: 0.1459, acc: 94.2903, loss_bbox: 0.1995, loss_mask: 0.2172, loss: 0.6127
2024-05-29 08:23:42,021 - mmdet - INFO - Epoch [12][5550/7330] lr: 1.000e-06, eta: 0:27:42, time: 0.968, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0324, loss_cls: 0.1371, acc: 94.7100, loss_bbox: 0.1883, loss_mask: 0.2047, loss: 0.5732
2024-05-29 08:24:27,931 - mmdet - INFO - Epoch [12][5600/7330] lr: 1.000e-06, eta: 0:26:56, time: 0.918, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0375, loss_cls: 0.1508, acc: 94.1467, loss_bbox: 0.2048, loss_mask: 0.2116, loss: 0.6164
2024-05-29 08:25:13,617 - mmdet - INFO - Epoch [12][5650/7330] lr: 1.000e-06, eta: 0:26:09, time: 0.914, data_time: 0.045, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0352, loss_cls: 0.1426, acc: 94.5029, loss_bbox: 0.1961, loss_mask: 0.2140, loss: 0.5987
2024-05-29 08:25:59,730 - mmdet - INFO - Epoch [12][5700/7330] lr: 1.000e-06, eta: 0:25:22, time: 0.922, data_time: 0.063, memory: 20925, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0381, loss_cls: 0.1495, acc: 94.3132, loss_bbox: 0.2023, loss_mask: 0.2067, loss: 0.6088
2024-05-29 08:26:46,140 - mmdet - INFO - Epoch [12][5750/7330] lr: 1.000e-06, eta: 0:24:36, time: 0.928, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0351, loss_cls: 0.1408, acc: 94.5610, loss_bbox: 0.1938, loss_mask: 0.2085, loss: 0.5893
2024-05-29 08:27:32,841 - mmdet - INFO - Epoch [12][5800/7330] lr: 1.000e-06, eta: 0:23:49, time: 0.934, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0378, loss_cls: 0.1439, acc: 94.4255, loss_bbox: 0.1970, loss_mask: 0.2082, loss: 0.5975
2024-05-29 08:28:23,207 - mmdet - INFO - Epoch [12][5850/7330] lr: 1.000e-06, eta: 0:23:02, time: 1.007, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0357, loss_cls: 0.1383, acc: 94.6655, loss_bbox: 0.1893, loss_mask: 0.2022, loss: 0.5762
2024-05-29 08:29:09,300 - mmdet - INFO - Epoch [12][5900/7330] lr: 1.000e-06, eta: 0:22:15, time: 0.922, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0388, loss_cls: 0.1473, acc: 94.2588, loss_bbox: 0.2004, loss_mask: 0.2090, loss: 0.6072
2024-05-29 08:29:55,573 - mmdet - INFO - Epoch [12][5950/7330] lr: 1.000e-06, eta: 0:21:29, time: 0.925, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0380, loss_cls: 0.1411, acc: 94.5354, loss_bbox: 0.1957, loss_mask: 0.2084, loss: 0.5941
2024-05-29 08:30:41,664 - mmdet - INFO - Epoch [12][6000/7330] lr: 1.000e-06, eta: 0:20:42, time: 0.922, data_time: 0.036, memory: 20925, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0342, loss_cls: 0.1391, acc: 94.6462, loss_bbox: 0.1895, loss_mask: 0.2053, loss: 0.5780
2024-05-29 08:31:31,170 - mmdet - INFO - Epoch [12][6050/7330] lr: 1.000e-06, eta: 0:19:55, time: 0.990, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0370, loss_cls: 0.1452, acc: 94.4170, loss_bbox: 0.1986, loss_mask: 0.2128, loss: 0.6049
2024-05-29 08:32:18,242 - mmdet - INFO - Epoch [12][6100/7330] lr: 1.000e-06, eta: 0:19:09, time: 0.941, data_time: 0.059, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0372, loss_cls: 0.1430, acc: 94.4785, loss_bbox: 0.1945, loss_mask: 0.2058, loss: 0.5917
2024-05-29 08:33:05,028 - mmdet - INFO - Epoch [12][6150/7330] lr: 1.000e-06, eta: 0:18:22, time: 0.936, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0373, loss_cls: 0.1472, acc: 94.2966, loss_bbox: 0.1972, loss_mask: 0.2109, loss: 0.6042
2024-05-29 08:33:50,586 - mmdet - INFO - Epoch [12][6200/7330] lr: 1.000e-06, eta: 0:17:35, time: 0.911, data_time: 0.039, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0352, loss_cls: 0.1431, acc: 94.5142, loss_bbox: 0.1935, loss_mask: 0.2079, loss: 0.5909
2024-05-29 08:34:39,121 - mmdet - INFO - Epoch [12][6250/7330] lr: 1.000e-06, eta: 0:16:49, time: 0.971, data_time: 0.055, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0354, loss_cls: 0.1449, acc: 94.4358, loss_bbox: 0.1937, loss_mask: 0.2087, loss: 0.5933
2024-05-29 08:35:25,010 - mmdet - INFO - Epoch [12][6300/7330] lr: 1.000e-06, eta: 0:16:02, time: 0.918, data_time: 0.038, memory: 20925, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0333, loss_cls: 0.1398, acc: 94.5994, loss_bbox: 0.1915, loss_mask: 0.2081, loss: 0.5832
2024-05-29 08:36:13,187 - mmdet - INFO - Epoch [12][6350/7330] lr: 1.000e-06, eta: 0:15:15, time: 0.964, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0343, loss_cls: 0.1348, acc: 94.8088, loss_bbox: 0.1861, loss_mask: 0.2024, loss: 0.5682
2024-05-29 08:37:01,978 - mmdet - INFO - Epoch [12][6400/7330] lr: 1.000e-06, eta: 0:14:28, time: 0.976, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0395, loss_cls: 0.1485, acc: 94.2566, loss_bbox: 0.2013, loss_mask: 0.2144, loss: 0.6157
2024-05-29 08:37:47,549 - mmdet - INFO - Epoch [12][6450/7330] lr: 1.000e-06, eta: 0:13:42, time: 0.911, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0338, loss_cls: 0.1416, acc: 94.5112, loss_bbox: 0.1940, loss_mask: 0.2033, loss: 0.5833
2024-05-29 08:38:35,487 - mmdet - INFO - Epoch [12][6500/7330] lr: 1.000e-06, eta: 0:12:55, time: 0.959, data_time: 0.050, memory: 20925, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0387, loss_cls: 0.1474, acc: 94.3113, loss_bbox: 0.2007, loss_mask: 0.2085, loss: 0.6072
2024-05-29 08:39:21,196 - mmdet - INFO - Epoch [12][6550/7330] lr: 1.000e-06, eta: 0:12:08, time: 0.914, data_time: 0.052, memory: 20925, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0379, loss_cls: 0.1407, acc: 94.5759, loss_bbox: 0.1946, loss_mask: 0.2088, loss: 0.5943
2024-05-29 08:40:10,063 - mmdet - INFO - Epoch [12][6600/7330] lr: 1.000e-06, eta: 0:11:22, time: 0.978, data_time: 0.036, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0365, loss_cls: 0.1462, acc: 94.4036, loss_bbox: 0.1969, loss_mask: 0.2071, loss: 0.5975
2024-05-29 08:40:56,038 - mmdet - INFO - Epoch [12][6650/7330] lr: 1.000e-06, eta: 0:10:35, time: 0.919, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0362, loss_cls: 0.1429, acc: 94.4346, loss_bbox: 0.1961, loss_mask: 0.2138, loss: 0.6001
2024-05-29 08:41:42,321 - mmdet - INFO - Epoch [12][6700/7330] lr: 1.000e-06, eta: 0:09:48, time: 0.926, data_time: 0.044, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0367, loss_cls: 0.1463, acc: 94.3921, loss_bbox: 0.1986, loss_mask: 0.2089, loss: 0.6020
2024-05-29 08:42:28,889 - mmdet - INFO - Epoch [12][6750/7330] lr: 1.000e-06, eta: 0:09:01, time: 0.931, data_time: 0.062, memory: 20925, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0371, loss_cls: 0.1510, acc: 94.2729, loss_bbox: 0.1985, loss_mask: 0.2115, loss: 0.6102
2024-05-29 08:43:14,703 - mmdet - INFO - Epoch [12][6800/7330] lr: 1.000e-06, eta: 0:08:15, time: 0.916, data_time: 0.047, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0356, loss_cls: 0.1462, acc: 94.4036, loss_bbox: 0.1950, loss_mask: 0.2087, loss: 0.5965
2024-05-29 08:44:00,383 - mmdet - INFO - Epoch [12][6850/7330] lr: 1.000e-06, eta: 0:07:28, time: 0.914, data_time: 0.046, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0372, loss_cls: 0.1398, acc: 94.5859, loss_bbox: 0.1905, loss_mask: 0.2082, loss: 0.5865
2024-05-29 08:44:50,426 - mmdet - INFO - Epoch [12][6900/7330] lr: 1.000e-06, eta: 0:06:41, time: 1.001, data_time: 0.051, memory: 20925, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0364, loss_cls: 0.1416, acc: 94.5708, loss_bbox: 0.1932, loss_mask: 0.2102, loss: 0.5924
2024-05-29 08:45:37,370 - mmdet - INFO - Epoch [12][6950/7330] lr: 1.000e-06, eta: 0:05:55, time: 0.939, data_time: 0.036, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0384, loss_cls: 0.1468, acc: 94.3401, loss_bbox: 0.2011, loss_mask: 0.2116, loss: 0.6095
2024-05-29 08:46:22,961 - mmdet - INFO - Epoch [12][7000/7330] lr: 1.000e-06, eta: 0:05:08, time: 0.912, data_time: 0.041, memory: 20925, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0347, loss_cls: 0.1438, acc: 94.5286, loss_bbox: 0.1946, loss_mask: 0.2044, loss: 0.5888
2024-05-29 08:47:08,559 - mmdet - INFO - Epoch [12][7050/7330] lr: 1.000e-06, eta: 0:04:21, time: 0.912, data_time: 0.042, memory: 20925, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0355, loss_cls: 0.1421, acc: 94.4897, loss_bbox: 0.1952, loss_mask: 0.2095, loss: 0.5940
2024-05-29 08:47:56,692 - mmdet - INFO - Epoch [12][7100/7330] lr: 1.000e-06, eta: 0:03:34, time: 0.963, data_time: 0.040, memory: 20925, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0338, loss_cls: 0.1388, acc: 94.6682, loss_bbox: 0.1915, loss_mask: 0.2073, loss: 0.5815
2024-05-29 08:48:42,887 - mmdet - INFO - Epoch [12][7150/7330] lr: 1.000e-06, eta: 0:02:48, time: 0.924, data_time: 0.048, memory: 20925, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0367, loss_cls: 0.1419, acc: 94.5369, loss_bbox: 0.1890, loss_mask: 0.2012, loss: 0.5798
2024-05-29 08:49:28,835 - mmdet - INFO - Epoch [12][7200/7330] lr: 1.000e-06, eta: 0:02:01, time: 0.919, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0345, loss_cls: 0.1359, acc: 94.7722, loss_bbox: 0.1857, loss_mask: 0.2043, loss: 0.5705
2024-05-29 08:50:15,371 - mmdet - INFO - Epoch [12][7250/7330] lr: 1.000e-06, eta: 0:01:14, time: 0.931, data_time: 0.043, memory: 20925, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0351, loss_cls: 0.1337, acc: 94.8801, loss_bbox: 0.1847, loss_mask: 0.2037, loss: 0.5680
2024-05-29 08:51:03,358 - mmdet - INFO - Epoch [12][7300/7330] lr: 1.000e-06, eta: 0:00:28, time: 0.960, data_time: 0.035, memory: 20925, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0354, loss_cls: 0.1373, acc: 94.7344, loss_bbox: 0.1884, loss_mask: 0.2071, loss: 0.5792
2024-05-29 08:51:31,559 - mmdet - INFO - Saving checkpoint at 12 epochs
2024-05-29 08:53:39,207 - mmdet - INFO - Evaluating bbox...
2024-05-29 08:53:59,843 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.500
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.726
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.548
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.324
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.541
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.670
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.611
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.611
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.611
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.432
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.653
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.773
2024-05-29 08:53:59,843 - mmdet - INFO - Evaluating segm...
2024-05-29 08:54:21,301 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.444
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.691
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.478
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.226
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.479
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.347
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.592
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.732
2024-05-29 08:54:21,575 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 08:54:21,577 - mmdet - INFO - Epoch(val) [12][625] bbox_mAP: 0.5000, bbox_mAP_50: 0.7260, bbox_mAP_75: 0.5480, bbox_mAP_s: 0.3240, bbox_mAP_m: 0.5410, bbox_mAP_l: 0.6700, bbox_mAP_copypaste: 0.500 0.726 0.548 0.324 0.541 0.670, segm_mAP: 0.4440, segm_mAP_50: 0.6910, segm_mAP_75: 0.4780, segm_mAP_s: 0.2260, segm_mAP_m: 0.4790, segm_mAP_l: 0.6590, segm_mAP_copypaste: 0.444 0.691 0.478 0.226 0.479 0.659