2024-05-31 22:22:38,954 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+807c71c
------------------------------------------------------------

2024-05-31 22:22:40,439 - mmdet - INFO - Distributed training: True
2024-05-31 22:22:41,923 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPThreeBranch',
        n_points=4,
        deform_num_heads=16,
        cffn_ratio=0.25,
        deform_ratio=0.5,
        with_cffn=True,
        interact_attn_type='deform',
        interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=448,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=24,
            embed_dim=1024,
            num_heads=16,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.4,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 1], [2, 3], [4, 5], [6, 7], [8, 9],
                                 [10, 11], [12, 13], [14, 15], [16, 17],
                                 [18, 19], [20, 21], [22, 23]],
            pretrained='./pretrained/deit_3_large_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True, True, True, True, True, True, True,
                True, True, True, True
            ],
            window_size=[
                28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28,
                28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28
            ],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch2=dict(
            real_size=896,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=768,
            num_heads=12,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.15,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch3=dict(
            real_size=1344,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=384,
            num_heads=6,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4],
                                 [5, 5], [6, 6], [7, 7], [8, 8], [9, 9],
                                 [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[
                True, True, True, True, True, True, True, True, True, True,
                True, True
            ],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True))),
    neck=dict(
        type='FPN',
        in_channels=[1024, 1024, 1024, 1024],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='DeltaXYWHBBoxCoder',
            target_means=[0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(
            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=80,
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0.0, 0.0, 0.0, 0.0],
                target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(
                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead',
            num_convs=4,
            in_channels=256,
            conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(
                type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=-1,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            mask_size=28,
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=1000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            score_thr=0.05,
            nms=dict(type='nms', iou_threshold=0.5),
            max_per_img=100,
            mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(
    mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1344, 806), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(
        type='Normalize',
        mean=[127.5, 127.5, 127.5],
        std=[127.5, 127.5, 127.5],
        to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1344, 806),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1344, 806), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(
                type='Collect',
                keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1344, 806),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1344, 806),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(
        num_layers=24, layer_decay_rate=0.85, skip_stride=[2, 2]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.001,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50,
                  hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(
    imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)

2024-05-31 22:22:47,071 - mmdet - INFO - Set random seed to 530467520, deterministic: False
2024-05-31 22:22:54,237 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-31 22:22:55,629 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-31 22:22:57,641 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-31 22:24:14,115 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-31 22:24:14,648 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-31 22:24:14,721 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
backbone.w1 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w2 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w3 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.pos_embed - torch.Size([1, 196, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.weight - torch.Size([1024, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.attn.proj.bias -
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.bias - 
torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.6.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.7.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch1.blocks.9.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.weight - torch.Size([1024]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.weight - torch.Size([1024, 1024]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.weight - torch.Size([3072, 
1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.17.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.bias - torch.Size([1024]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.bias - torch.Size([4096]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.pos_embed - torch.Size([1, 196, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.weight - torch.Size([768, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.0.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.2.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.4.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.5.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.7.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.11.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
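Note: the repeated message that a value "is the same before and after calling `init_weights`" simply records that `init_weights` did not overwrite these tensors, which is the typical outcome when branch weights are taken from pretrained checkpoints. If in doubt, a few tensors can be compared against the checkpoint file directly; the helper below is a hypothetical sketch (the `model` argument, checkpoint path and key mapping are placeholders, and the real prefixes depend on how the backbone remaps keys when loading):

# Hypothetical sanity check: compare selected model tensors with the corresponding
# entries of a pretrained checkpoint to confirm they were actually loaded.
import torch

def check_loaded_from_checkpoint(model, ckpt_path, pairs):
    """pairs is a list of (checkpoint_key, model_key) tuples to compare."""
    state = model.state_dict()
    ckpt = torch.load(ckpt_path, map_location="cpu")
    ckpt = ckpt.get("model", ckpt)  # some checkpoints nest weights under "model"
    for ckpt_key, model_key in pairs:
        same = torch.equal(ckpt[ckpt_key].float(), state[model_key].cpu().float())
        print(f"{model_key}: matches checkpoint = {same}")

# Example usage (placeholder path and key mapping):
# check_loaded_from_checkpoint(model, "path/to/branch2_pretrained.pth",
#                              [("blocks.0.attn.qkv.weight",
#                                "backbone.branch2.blocks.0.attn.qkv.weight")])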
backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
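Note: the interaction-unit shapes are also self-consistent. Each unit first projects features from one branch width to the other (e.g. the [1024, 768] branch2to1_proj), and the deformable-attention injector sizes match 16 heads x 4 sampling points over a single feature level, with the value projection at half the query width and the FFN hidden size at a quarter of it (judging from the logged sizes). A small arithmetic sketch (not repository code) that reproduces them:

# Sketch only: shape arithmetic for the injector modules logged above, assuming
# 16 deformable heads, 4 sampling points, one feature level, a value projection
# at 0.5x the query width, and an FFN hidden size at 0.25x the query width.
def injector_shapes(query_dim, heads=16, points=4, levels=1,
                    deform_ratio=0.5, cffn_ratio=0.25):
    value_dim = int(query_dim * deform_ratio)
    return {
        "attn.sampling_offsets.weight": (heads * levels * points * 2, query_dim),
        "attn.attention_weights.weight": (heads * levels * points, query_dim),
        "attn.value_proj.weight": (value_dim, query_dim),
        "attn.output_proj.weight": (query_dim, value_dim),
        "ffn.fc1.weight": (int(query_dim * cffn_ratio), query_dim),
    }

# Reproduces the sizes logged for the 1024-, 768- and 384-wide branches:
assert injector_shapes(1024)["attn.sampling_offsets.weight"] == (128, 1024)
assert injector_shapes(1024)["attn.value_proj.weight"] == (512, 1024)
assert injector_shapes(768)["attn.value_proj.weight"] == (384, 768)
assert injector_shapes(384)["ffn.fc1.weight"] == (96, 384)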
backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling 
`init_weights` of MaskRCNN
(The message "The value is the same before and after calling `init_weights` of MaskRCNN" is repeated for every remaining parameter of backbone.interactions.0.interaction_units_23 and for every parameter of backbone.interactions.1 and backbone.interactions.2. Each interaction contains an interaction_units_12 and an interaction_units_23; each unit contains a branch2to1_proj and a branch1to2_proj linear projection plus a branch2to1_injector and a branch1to2_injector with parameters ca_gamma, cffn_gamma, query_norm, feat_norm, attn.sampling_offsets, attn.attention_weights, attn.value_proj, attn.output_proj, ffn.fc1, ffn.dwconv.dwconv, ffn.fc2 and ffn_norm. For a query width C of 1024, 768 or 384, the gammas and norms have shape [C], sampling_offsets [128, C], attention_weights [64, C], value_proj [C/2, C], output_proj [C, C/2], ffn.fc1 [C/4, C], ffn.dwconv.dwconv [C/4, 1, 3, 3] and ffn.fc2 [C, C/4]; the projections map between 768 and 1024 for interaction_units_12 and between 384 and 768 for interaction_units_23.)
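As a reading aid for the parameter names and shapes above, the following is a minimal, hypothetical PyTorch sketch (not the actual PIIP / ViT-Adapter implementation) that reproduces the same parameter layout: a deformable cross-attention injector with a depthwise-conv FFN, wrapped into a two-branch interaction unit with its branch2to1_proj / branch1to2_proj projections. The head and point counts and the channel ratios are read off the logged shapes (64 head-point pairs, e.g. 16 heads x 4 points; value width C/2; FFN hidden width C/4). The deformable sampling forward pass is omitted, so the class names, defaults and zero-initialized gammas here are assumptions, not code from this repository.

import torch
import torch.nn as nn


class Injector(nn.Module):
    """Hypothetical sketch of one `branch*to*_injector` (parameter layout only).

    The real module applies deformable cross-attention from the other branch's
    features into this branch; that forward pass is omitted here.
    """

    def __init__(self, dim, num_heads=16, n_points=4,
                 deform_ratio=0.5, cffn_ratio=0.25):
        super().__init__()
        value_dim = int(dim * deform_ratio)   # 1024 -> 512, 768 -> 384, 384 -> 192
        hidden_dim = int(dim * cffn_ratio)    # 1024 -> 256, 768 -> 192, 384 -> 96

        # learnable residual scales for the attention and FFN paths (init assumed)
        self.ca_gamma = nn.Parameter(torch.zeros(dim))
        self.cffn_gamma = nn.Parameter(torch.zeros(dim))

        self.query_norm = nn.LayerNorm(dim)
        self.feat_norm = nn.LayerNorm(dim)

        # deformable attention: offsets and weights for num_heads * n_points
        # sampling locations per query (16 * 4 = 64 weights, * 2 coords = 128 offsets)
        self.attn = nn.ModuleDict(dict(
            sampling_offsets=nn.Linear(dim, num_heads * n_points * 2),
            attention_weights=nn.Linear(dim, num_heads * n_points),
            value_proj=nn.Linear(dim, value_dim),
            output_proj=nn.Linear(value_dim, dim),
        ))

        # ConvFFN: fc1 -> 3x3 depthwise conv -> fc2, followed by its LayerNorm
        self.ffn = nn.ModuleDict(dict(
            fc1=nn.Linear(dim, hidden_dim),
            dwconv=nn.ModuleDict(dict(
                dwconv=nn.Conv2d(hidden_dim, hidden_dim, 3,
                                 padding=1, groups=hidden_dim))),
            fc2=nn.Linear(hidden_dim, dim),
        ))
        self.ffn_norm = nn.LayerNorm(dim)


class InteractionUnit(nn.Module):
    """Hypothetical sketch of one `interaction_units_12` / `interaction_units_23`."""

    def __init__(self, dim1, dim2):
        super().__init__()
        self.branch2to1_proj = nn.Linear(dim2, dim1)   # e.g. 768 -> 1024
        self.branch2to1_injector = Injector(dim1)
        self.branch1to2_proj = nn.Linear(dim1, dim2)   # e.g. 1024 -> 768
        self.branch1to2_injector = Injector(dim2)


# Parameter names and shapes line up with the log, e.g. for interaction_units_12:
for name, p in InteractionUnit(1024, 768).named_parameters():
    print(name, tuple(p.shape))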
(The same message is likewise reported for every parameter of the interaction units in backbone.interactions.3 and, within backbone.interactions.4, for interaction_units_12 and for interaction_units_23 up to branch2to1_injector.ca_gamma, with the module layout and shapes described above.)
backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
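[Editor's note] The dump above and below repeats one fixed message per parameter ("name - torch.Size([...]): The value is the same before and after calling `init_weights` of MaskRCNN"). For anyone post-processing this log, a minimal parsing sketch is given here; the file name train.log is a placeholder and the regex simply mirrors the observed message format, so treat this as an illustration rather than part of the original run.

```python
# Sketch: tally the "value is the same before and after calling `init_weights`"
# messages in a saved copy of this log. "train.log" is a hypothetical path;
# the regex mirrors the message format seen in the dump above.
import re
from collections import Counter

MSG = re.compile(
    r"(?P<name>[\w.]+) - torch\.Size\(\[(?P<shape>[\d, ]+)\]\):\s*"
    r"The value is the same before and after calling `init_weights` of MaskRCNN"
)

with open("train.log") as f:  # hypothetical log path
    text = f.read()

# Each entry: (parameter name, shape tuple) for a parameter left untouched by init_weights.
unchanged = [
    (m["name"], tuple(int(d) for d in m["shape"].split(",")))
    for m in MSG.finditer(text)
]

# Group by top-level submodule (e.g. "backbone") to see where the unchanged parameters live.
by_module = Counter(name.split(".")[0] for name, _ in unchanged)
print(f"{len(unchanged)} parameters reported unchanged by `init_weights`")
for module, count in by_module.most_common():
    print(f"  {module}: {count}")
```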
backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight 
- torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
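[Editor's note] The injector shapes reported throughout this dump scale consistently with the query width of each branch pair: sampling_offsets and attention_weights always map to 128 and 64 outputs regardless of width, while value_proj and the ConvFFN hidden layer shrink to 1/2 and 1/4 of the query width. The sketch below checks that relationship against shapes quoted from the log; the 16-head x 4-point split and the 0.5/0.25 ratios are illustrative assumptions (only the products and ratios are visible in the shapes themselves).

```python
# Sanity-check sketch: the injector shapes logged here are consistent with a
# deformable-attention block whose projections scale with the query dimension.
# The head/point split (16 x 4) and the 0.5 / 0.25 ratios are assumptions for
# illustration; only the products and ratios can be read off the log itself.
def expected_injector_shapes(dim, heads=16, points=4, deform_ratio=0.5, cffn_ratio=0.25):
    return {
        "attn.sampling_offsets.weight": (heads * points * 2, dim),   # 128 x dim
        "attn.attention_weights.weight": (heads * points, dim),      # 64 x dim
        "attn.value_proj.weight": (int(dim * deform_ratio), dim),
        "attn.output_proj.weight": (dim, int(dim * deform_ratio)),
        "ffn.fc1.weight": (int(dim * cffn_ratio), dim),
        "ffn.fc2.weight": (dim, int(dim * cffn_ratio)),
    }

# Shapes quoted from the log for the three injector widths (1024 / 768 / 384).
logged = {
    1024: {"attn.sampling_offsets.weight": (128, 1024),
           "attn.attention_weights.weight": (64, 1024),
           "attn.value_proj.weight": (512, 1024),
           "attn.output_proj.weight": (1024, 512),
           "ffn.fc1.weight": (256, 1024),
           "ffn.fc2.weight": (1024, 256)},
    768:  {"attn.sampling_offsets.weight": (128, 768),
           "attn.attention_weights.weight": (64, 768),
           "attn.value_proj.weight": (384, 768),
           "attn.output_proj.weight": (768, 384),
           "ffn.fc1.weight": (192, 768),
           "ffn.fc2.weight": (768, 192)},
    384:  {"attn.sampling_offsets.weight": (128, 384),
           "attn.attention_weights.weight": (64, 384),
           "attn.value_proj.weight": (192, 384),
           "attn.output_proj.weight": (384, 192),
           "ffn.fc1.weight": (96, 384),
           "ffn.fc2.weight": (384, 96)},
}

for dim, shapes in logged.items():
    assert expected_injector_shapes(dim) == shapes, f"mismatch at dim={dim}"
print("logged injector shapes match the assumed parameterization")
```

If the asserts pass, the three injector widths seen in the log differ only by their query dimension; every derived projection follows the same proportional layout.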
backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight - 
torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - 
torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.bias 
- torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
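The projection sizes reported for these cross-branch injectors follow directly from the branch embedding dimensions (1024 / 768 / 384) together with the deformable-attention settings in the config dumped earlier (16 heads, 4 sampling points, deform_ratio 0.5, cffn_ratio 0.25). A minimal sketch that reproduces the weight shapes above, assuming the usual ViT-Adapter-style injector layout with a single feature level (an illustration, not the PIIP implementation):

# Sketch: reproduce the injector weight shapes reported in this log, assuming a
# ViT-Adapter-style deformable-attention injector with one feature level.
def injector_shapes(dim, heads=16, points=4, levels=1,
                    deform_ratio=0.5, cffn_ratio=0.25):
    value_dim = int(dim * deform_ratio)      # e.g. 768 -> 384
    hidden_dim = int(dim * cffn_ratio)       # e.g. 768 -> 192
    return {
        'attn.sampling_offsets.weight': (heads * levels * points * 2, dim),  # (128, dim)
        'attn.attention_weights.weight': (heads * levels * points, dim),     # (64, dim)
        'attn.value_proj.weight': (value_dim, dim),
        'attn.output_proj.weight': (dim, value_dim),
        'ffn.fc1.weight': (hidden_dim, dim),
        'ffn.dwconv.dwconv.weight': (hidden_dim, 1, 3, 3),
        'ffn.fc2.weight': (dim, hidden_dim),
    }

for dim in (1024, 768, 384):                 # branch1, branch2, branch3
    print(dim, injector_shapes(dim))

With dim=768 this gives, for example, the (384, 768) value_proj and (192, 768) fc1 sizes listed above, and dim=1024 / dim=384 give the 512- and 192-wide value projections seen elsewhere in the report.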
backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
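The per-parameter summary continues below with the neck and the detection heads: the FPN lateral and output convs are reported as XavierInit (gain=1, uniform, bias=0), the RPN conv/cls/reg layers and the RoI classification head fc_cls as NormalInit (mean=0, std=0.01, bias=0), and the box-regression head fc_reg as NormalInit with std=0.001. In plain PyTorch those three conventions correspond roughly to the following (layer shapes taken from the log; a sketch of the conventions, not the mmcv initializer code):

import torch.nn as nn

# Xavier-uniform, gain=1 -- as reported for the FPN lateral/output convs below.
lateral_conv = nn.Conv2d(1024, 256, kernel_size=1)
nn.init.xavier_uniform_(lateral_conv.weight, gain=1.0)
nn.init.constant_(lateral_conv.bias, 0.0)

# Normal(mean=0, std=0.01), bias=0 -- as reported for the RPN layers and fc_cls.
rpn_cls = nn.Conv2d(256, 3, kernel_size=1)
nn.init.normal_(rpn_cls.weight, mean=0.0, std=0.01)
nn.init.constant_(rpn_cls.bias, 0.0)

# Normal(mean=0, std=0.001), bias=0 -- as reported for the regression head fc_reg.
fc_reg = nn.Linear(1024, 320)
nn.init.normal_(fc_reg.weight, mean=0.0, std=0.001)
nn.init.constant_(fc_reg.bias, 0.0)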
backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.0.weight - torch.Size([1024, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.merge_branch3.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.0.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 
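Right after the initialization report, the log builds the layer-wise learning-rate decay: {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2]} and "Build LayerDecayOptimizerConstructor 0.850000 - 26", i.e. 26 group levels (the 24 branch1 blocks, level 0 for the patch/position embeddings, and level 25 for everything outside the blocks: interactions, merge/fpn convs, neck and heads). The lr_scale values in the param-group dump that follows are consistent with the standard BEiT-style rule lr_scale = 0.85 ** (25 - layer_id) on top of the base lr 1e-4, and each level is further split into a _decay group (weights, weight_decay 0.05) and a _no_decay group (biases, norm and gamma parameters, weight_decay 0.0). A quick check of the logged values, assuming that rule (a sketch, not the constructor code itself):

# Sketch: reproduce the lr_scale values printed in the param-group dump below,
# assuming the usual BEiT/ViT layer-wise decay rule.
decay_rate = 0.85
num_groups = 26            # "Build LayerDecayOptimizerConstructor 0.850000 - 26"
base_lr = 1e-4

for layer_id in (0, 1, 2, 25):
    scale = decay_rate ** (num_groups - 1 - layer_id)
    print(layer_id, scale, scale * base_lr)

# 0  -> 0.017197809852207896   (layer_0_decay / layer_0_no_decay)
# 1  -> 0.02023271747318576    (layer_1_*)
# 2  -> 0.023803197027277366   (layer_2_*)
# 25 -> 1.0                    (layer_25_decay: lr stays at the base 1e-4)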
neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead 2024-05-31 22:24:32,260 - mmdet - INFO - {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2]} 2024-05-31 22:24:32,260 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 26 2024-05-31 22:24:32,274 - mmdet - INFO - Param groups = { "layer_25_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", 
"backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", 
"backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", "neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", "backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.05 }, "layer_0_no_decay": { 
"param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", "backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", "backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias" ], "lr_scale": 
0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", "backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", 
"backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", "backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", "backbone.branch3.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", 
"backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", "backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", "backbone.branch3.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", 
"backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.branch1.blocks.12.gamma_1", "backbone.branch1.blocks.12.gamma_2", "backbone.branch1.blocks.12.norm1.weight", "backbone.branch1.blocks.12.norm1.bias", "backbone.branch1.blocks.12.attn.qkv.bias", "backbone.branch1.blocks.12.attn.proj.bias", 
"backbone.branch1.blocks.12.norm2.weight", "backbone.branch1.blocks.12.norm2.bias", "backbone.branch1.blocks.12.mlp.fc1.bias", "backbone.branch1.blocks.12.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_13_decay": { "param_names": [ "backbone.branch1.blocks.12.attn.qkv.weight", "backbone.branch1.blocks.12.attn.proj.weight", "backbone.branch1.blocks.12.mlp.fc1.weight", "backbone.branch1.blocks.12.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_14_no_decay": { "param_names": [ "backbone.branch1.blocks.13.gamma_1", "backbone.branch1.blocks.13.gamma_2", "backbone.branch1.blocks.13.norm1.weight", "backbone.branch1.blocks.13.norm1.bias", "backbone.branch1.blocks.13.attn.qkv.bias", "backbone.branch1.blocks.13.attn.proj.bias", "backbone.branch1.blocks.13.norm2.weight", "backbone.branch1.blocks.13.norm2.bias", "backbone.branch1.blocks.13.mlp.fc1.bias", "backbone.branch1.blocks.13.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", "backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_14_decay": { "param_names": [ "backbone.branch1.blocks.13.attn.qkv.weight", "backbone.branch1.blocks.13.attn.proj.weight", "backbone.branch1.blocks.13.mlp.fc1.weight", "backbone.branch1.blocks.13.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_15_no_decay": { "param_names": [ "backbone.branch1.blocks.14.gamma_1", "backbone.branch1.blocks.14.gamma_2", "backbone.branch1.blocks.14.norm1.weight", "backbone.branch1.blocks.14.norm1.bias", "backbone.branch1.blocks.14.attn.qkv.bias", "backbone.branch1.blocks.14.attn.proj.bias", "backbone.branch1.blocks.14.norm2.weight", "backbone.branch1.blocks.14.norm2.bias", "backbone.branch1.blocks.14.mlp.fc1.bias", "backbone.branch1.blocks.14.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_15_decay": { "param_names": [ "backbone.branch1.blocks.14.attn.qkv.weight", "backbone.branch1.blocks.14.attn.proj.weight", "backbone.branch1.blocks.14.mlp.fc1.weight", "backbone.branch1.blocks.14.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_16_no_decay": { "param_names": [ "backbone.branch1.blocks.15.gamma_1", 
"backbone.branch1.blocks.15.gamma_2", "backbone.branch1.blocks.15.norm1.weight", "backbone.branch1.blocks.15.norm1.bias", "backbone.branch1.blocks.15.attn.qkv.bias", "backbone.branch1.blocks.15.attn.proj.bias", "backbone.branch1.blocks.15.norm2.weight", "backbone.branch1.blocks.15.norm2.bias", "backbone.branch1.blocks.15.mlp.fc1.bias", "backbone.branch1.blocks.15.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_16_decay": { "param_names": [ "backbone.branch1.blocks.15.attn.qkv.weight", "backbone.branch1.blocks.15.attn.proj.weight", "backbone.branch1.blocks.15.mlp.fc1.weight", "backbone.branch1.blocks.15.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_17_no_decay": { "param_names": [ "backbone.branch1.blocks.16.gamma_1", "backbone.branch1.blocks.16.gamma_2", "backbone.branch1.blocks.16.norm1.weight", "backbone.branch1.blocks.16.norm1.bias", "backbone.branch1.blocks.16.attn.qkv.bias", "backbone.branch1.blocks.16.attn.proj.bias", "backbone.branch1.blocks.16.norm2.weight", "backbone.branch1.blocks.16.norm2.bias", "backbone.branch1.blocks.16.mlp.fc1.bias", "backbone.branch1.blocks.16.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_17_decay": { "param_names": [ "backbone.branch1.blocks.16.attn.qkv.weight", "backbone.branch1.blocks.16.attn.proj.weight", "backbone.branch1.blocks.16.mlp.fc1.weight", "backbone.branch1.blocks.16.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_18_no_decay": { "param_names": [ "backbone.branch1.blocks.17.gamma_1", "backbone.branch1.blocks.17.gamma_2", "backbone.branch1.blocks.17.norm1.weight", "backbone.branch1.blocks.17.norm1.bias", "backbone.branch1.blocks.17.attn.qkv.bias", "backbone.branch1.blocks.17.attn.proj.bias", "backbone.branch1.blocks.17.norm2.weight", "backbone.branch1.blocks.17.norm2.bias", "backbone.branch1.blocks.17.mlp.fc1.bias", "backbone.branch1.blocks.17.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", 
"backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_18_decay": { "param_names": [ "backbone.branch1.blocks.17.attn.qkv.weight", "backbone.branch1.blocks.17.attn.proj.weight", "backbone.branch1.blocks.17.mlp.fc1.weight", "backbone.branch1.blocks.17.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_19_no_decay": { "param_names": [ "backbone.branch1.blocks.18.gamma_1", "backbone.branch1.blocks.18.gamma_2", "backbone.branch1.blocks.18.norm1.weight", "backbone.branch1.blocks.18.norm1.bias", "backbone.branch1.blocks.18.attn.qkv.bias", "backbone.branch1.blocks.18.attn.proj.bias", "backbone.branch1.blocks.18.norm2.weight", "backbone.branch1.blocks.18.norm2.bias", "backbone.branch1.blocks.18.mlp.fc1.bias", "backbone.branch1.blocks.18.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_19_decay": { "param_names": [ "backbone.branch1.blocks.18.attn.qkv.weight", "backbone.branch1.blocks.18.attn.proj.weight", "backbone.branch1.blocks.18.mlp.fc1.weight", "backbone.branch1.blocks.18.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_20_no_decay": { "param_names": [ "backbone.branch1.blocks.19.gamma_1", "backbone.branch1.blocks.19.gamma_2", "backbone.branch1.blocks.19.norm1.weight", "backbone.branch1.blocks.19.norm1.bias", "backbone.branch1.blocks.19.attn.qkv.bias", "backbone.branch1.blocks.19.attn.proj.bias", "backbone.branch1.blocks.19.norm2.weight", "backbone.branch1.blocks.19.norm2.bias", "backbone.branch1.blocks.19.mlp.fc1.bias", "backbone.branch1.blocks.19.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_20_decay": { "param_names": [ 
"backbone.branch1.blocks.19.attn.qkv.weight", "backbone.branch1.blocks.19.attn.proj.weight", "backbone.branch1.blocks.19.mlp.fc1.weight", "backbone.branch1.blocks.19.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", "backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_21_no_decay": { "param_names": [ "backbone.branch1.blocks.20.gamma_1", "backbone.branch1.blocks.20.gamma_2", "backbone.branch1.blocks.20.norm1.weight", "backbone.branch1.blocks.20.norm1.bias", "backbone.branch1.blocks.20.attn.qkv.bias", "backbone.branch1.blocks.20.attn.proj.bias", "backbone.branch1.blocks.20.norm2.weight", "backbone.branch1.blocks.20.norm2.bias", "backbone.branch1.blocks.20.mlp.fc1.bias", "backbone.branch1.blocks.20.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_21_decay": { "param_names": [ "backbone.branch1.blocks.20.attn.qkv.weight", "backbone.branch1.blocks.20.attn.proj.weight", "backbone.branch1.blocks.20.mlp.fc1.weight", "backbone.branch1.blocks.20.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_22_no_decay": { "param_names": [ "backbone.branch1.blocks.21.gamma_1", "backbone.branch1.blocks.21.gamma_2", "backbone.branch1.blocks.21.norm1.weight", "backbone.branch1.blocks.21.norm1.bias", "backbone.branch1.blocks.21.attn.qkv.bias", "backbone.branch1.blocks.21.attn.proj.bias", "backbone.branch1.blocks.21.norm2.weight", "backbone.branch1.blocks.21.norm2.bias", "backbone.branch1.blocks.21.mlp.fc1.bias", "backbone.branch1.blocks.21.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_22_decay": { "param_names": [ "backbone.branch1.blocks.21.attn.qkv.weight", "backbone.branch1.blocks.21.attn.proj.weight", "backbone.branch1.blocks.21.mlp.fc1.weight", "backbone.branch1.blocks.21.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_23_no_decay": { 
"param_names": [ "backbone.branch1.blocks.22.gamma_1", "backbone.branch1.blocks.22.gamma_2", "backbone.branch1.blocks.22.norm1.weight", "backbone.branch1.blocks.22.norm1.bias", "backbone.branch1.blocks.22.attn.qkv.bias", "backbone.branch1.blocks.22.attn.proj.bias", "backbone.branch1.blocks.22.norm2.weight", "backbone.branch1.blocks.22.norm2.bias", "backbone.branch1.blocks.22.mlp.fc1.bias", "backbone.branch1.blocks.22.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_23_decay": { "param_names": [ "backbone.branch1.blocks.22.attn.qkv.weight", "backbone.branch1.blocks.22.attn.proj.weight", "backbone.branch1.blocks.22.mlp.fc1.weight", "backbone.branch1.blocks.22.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_24_no_decay": { "param_names": [ "backbone.branch1.blocks.23.gamma_1", "backbone.branch1.blocks.23.gamma_2", "backbone.branch1.blocks.23.norm1.weight", "backbone.branch1.blocks.23.norm1.bias", "backbone.branch1.blocks.23.attn.qkv.bias", "backbone.branch1.blocks.23.attn.proj.bias", "backbone.branch1.blocks.23.norm2.weight", "backbone.branch1.blocks.23.norm2.bias", "backbone.branch1.blocks.23.mlp.fc1.bias", "backbone.branch1.blocks.23.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", "backbone.branch3.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_24_decay": { "param_names": [ "backbone.branch1.blocks.23.attn.qkv.weight", "backbone.branch1.blocks.23.attn.proj.weight", "backbone.branch1.blocks.23.mlp.fc1.weight", "backbone.branch1.blocks.23.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_25_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", 
"backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", 
"backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", 
"backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", 
"backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", 
"backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", 
"backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", 
"backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", 
"backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", 
"backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", 
"backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", 
"backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", "roi_head.bbox_head.shared_fcs.0.bias", "roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-31 22:25:00,651 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 
2024-05-31 22:25:01,051 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16 2024-05-31 22:25:01,052 - mmdet - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) StepLrUpdaterHook (49 ) ToBFloat16HookMMDet (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_epoch: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_iter: (VERY_HIGH ) StepLrUpdaterHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook -------------------- after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) DeepspeedCheckpointHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- after_train_epoch: (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_val_epoch: (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (VERY_LOW ) TextLoggerHook -------------------- before_val_iter: (LOW ) IterTimerHook -------------------- after_val_iter: (LOW ) IterTimerHook -------------------- after_val_epoch: (VERY_LOW ) TextLoggerHook -------------------- after_run: (VERY_LOW ) TextLoggerHook -------------------- 2024-05-31 22:25:01,052 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs 2024-05-31 22:25:01,063 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16 by HardDiskBackend. 2024-05-31 22:25:39,849 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 18:56:14, time: 0.776, data_time: 0.100, memory: 12891, loss_rpn_cls: 0.6380, loss_rpn_bbox: 0.1098, loss_cls: 1.7534, acc: 67.2178, loss_bbox: 0.0269, loss_mask: 1.1897, loss: 3.7178 2024-05-31 22:26:10,580 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 16:57:47, time: 0.615, data_time: 0.032, memory: 12987, loss_rpn_cls: 0.3232, loss_rpn_bbox: 0.1103, loss_cls: 0.3791, acc: 95.2695, loss_bbox: 0.1500, loss_mask: 0.7569, loss: 1.7196 2024-05-31 22:26:40,932 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 16:14:14, time: 0.607, data_time: 0.029, memory: 12987, loss_rpn_cls: 0.2632, loss_rpn_bbox: 0.1025, loss_cls: 0.3586, acc: 94.7927, loss_bbox: 0.1709, loss_mask: 0.7025, loss: 1.5978 2024-05-31 22:27:11,501 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 15:53:51, time: 0.611, data_time: 0.035, memory: 12987, loss_rpn_cls: 0.2459, loss_rpn_bbox: 0.1002, loss_cls: 0.3458, acc: 94.6465, loss_bbox: 0.1796, loss_mask: 0.6877, loss: 1.5592 2024-05-31 22:27:47,016 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 16:10:18, time: 0.710, data_time: 0.030, memory: 13166, loss_rpn_cls: 0.1994, loss_rpn_bbox: 0.0950, loss_cls: 0.3630, acc: 94.1243, loss_bbox: 0.2033, loss_mask: 0.6748, loss: 1.5356 2024-05-31 22:28:18,001 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 15:59:02, time: 0.620, data_time: 0.040, memory: 13166, loss_rpn_cls: 0.1771, loss_rpn_bbox: 0.1039, loss_cls: 0.4382, acc: 92.8164, loss_bbox: 0.2538, loss_mask: 0.6594, loss: 1.6325 2024-05-31 22:28:48,865 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 15:50:18, time: 0.617, data_time: 0.035, memory: 13234, loss_rpn_cls: 0.1447, loss_rpn_bbox: 0.1018, loss_cls: 0.4357, acc: 92.4468, loss_bbox: 0.2705, loss_mask: 0.6367, loss: 1.5894 2024-05-31 22:29:19,986 - mmdet - INFO - Epoch 
[1][400/7330] lr: 7.982e-05, eta: 15:44:35, time: 0.622, data_time: 0.030, memory: 13234, loss_rpn_cls: 0.1268, loss_rpn_bbox: 0.0922, loss_cls: 0.4439, acc: 91.9236, loss_bbox: 0.2951, loss_mask: 0.6045, loss: 1.5625 2024-05-31 22:29:50,780 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 15:38:56, time: 0.616, data_time: 0.025, memory: 13234, loss_rpn_cls: 0.1127, loss_rpn_bbox: 0.0888, loss_cls: 0.4313, acc: 91.6487, loss_bbox: 0.3073, loss_mask: 0.5813, loss: 1.5214 2024-05-31 22:30:22,060 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 15:35:46, time: 0.626, data_time: 0.035, memory: 13234, loss_rpn_cls: 0.1138, loss_rpn_bbox: 0.0956, loss_cls: 0.4382, acc: 91.0247, loss_bbox: 0.3260, loss_mask: 0.5646, loss: 1.5382 2024-05-31 22:30:53,051 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 15:32:18, time: 0.620, data_time: 0.032, memory: 13252, loss_rpn_cls: 0.1132, loss_rpn_bbox: 0.0943, loss_cls: 0.4137, acc: 91.0823, loss_bbox: 0.3223, loss_mask: 0.5307, loss: 1.4740 2024-05-31 22:31:24,331 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 15:30:01, time: 0.626, data_time: 0.034, memory: 13252, loss_rpn_cls: 0.1071, loss_rpn_bbox: 0.0954, loss_cls: 0.4185, acc: 90.5264, loss_bbox: 0.3425, loss_mask: 0.5181, loss: 1.4817 2024-05-31 22:31:55,447 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 15:27:39, time: 0.622, data_time: 0.026, memory: 13252, loss_rpn_cls: 0.0920, loss_rpn_bbox: 0.0838, loss_cls: 0.3911, acc: 90.7278, loss_bbox: 0.3388, loss_mask: 0.5082, loss: 1.4140 2024-05-31 22:32:29,370 - mmdet - INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 15:31:22, time: 0.678, data_time: 0.034, memory: 13252, loss_rpn_cls: 0.0988, loss_rpn_bbox: 0.0923, loss_cls: 0.4105, acc: 90.0750, loss_bbox: 0.3557, loss_mask: 0.4876, loss: 1.4449 2024-05-31 22:33:00,732 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 15:29:33, time: 0.627, data_time: 0.032, memory: 13256, loss_rpn_cls: 0.0858, loss_rpn_bbox: 0.0899, loss_cls: 0.3807, acc: 90.5034, loss_bbox: 0.3419, loss_mask: 0.4747, loss: 1.3731 2024-05-31 22:33:31,972 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 15:27:41, time: 0.625, data_time: 0.030, memory: 13274, loss_rpn_cls: 0.0844, loss_rpn_bbox: 0.0833, loss_cls: 0.3789, acc: 90.4326, loss_bbox: 0.3428, loss_mask: 0.4729, loss: 1.3623 2024-05-31 22:34:03,371 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 15:26:15, time: 0.628, data_time: 0.028, memory: 13274, loss_rpn_cls: 0.0826, loss_rpn_bbox: 0.0854, loss_cls: 0.3660, acc: 90.4543, loss_bbox: 0.3448, loss_mask: 0.4631, loss: 1.3418 2024-05-31 22:34:34,843 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 15:25:01, time: 0.629, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0870, loss_rpn_bbox: 0.0852, loss_cls: 0.3721, acc: 90.1365, loss_bbox: 0.3591, loss_mask: 0.4429, loss: 1.3464 2024-05-31 22:35:06,056 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 15:23:29, time: 0.624, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0821, loss_rpn_bbox: 0.0833, loss_cls: 0.3770, acc: 89.9380, loss_bbox: 0.3667, loss_mask: 0.4509, loss: 1.3600 2024-05-31 22:35:37,324 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 22:35:37,324 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 15:22:07, time: 0.625, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0837, loss_rpn_bbox: 0.0864, loss_cls: 0.3645, acc: 89.8418, loss_bbox: 0.3649, loss_mask: 0.4374, loss: 1.3368 2024-05-31 22:36:08,760 - mmdet - INFO - Epoch 
[1][1050/7330] lr: 1.000e-04, eta: 15:21:04, time: 0.629, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0802, loss_rpn_bbox: 0.0865, loss_cls: 0.3592, acc: 89.9712, loss_bbox: 0.3577, loss_mask: 0.4417, loss: 1.3253 2024-05-31 22:36:40,421 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 15:20:22, time: 0.633, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0821, loss_rpn_bbox: 0.0864, loss_cls: 0.3599, acc: 90.1018, loss_bbox: 0.3564, loss_mask: 0.4380, loss: 1.3228 2024-05-31 22:37:22,437 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 15:32:42, time: 0.840, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0792, loss_rpn_bbox: 0.0785, loss_cls: 0.3494, acc: 90.3110, loss_bbox: 0.3516, loss_mask: 0.4321, loss: 1.2907 2024-05-31 22:37:53,886 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 15:31:13, time: 0.629, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0761, loss_rpn_bbox: 0.0795, loss_cls: 0.3554, acc: 90.0425, loss_bbox: 0.3596, loss_mask: 0.4289, loss: 1.2996 2024-05-31 22:38:25,248 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 15:29:42, time: 0.627, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0745, loss_rpn_bbox: 0.0825, loss_cls: 0.3484, acc: 90.0105, loss_bbox: 0.3639, loss_mask: 0.4187, loss: 1.2880 2024-05-31 22:38:56,871 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 15:28:34, time: 0.633, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0757, loss_rpn_bbox: 0.0805, loss_cls: 0.3400, acc: 90.3611, loss_bbox: 0.3483, loss_mask: 0.4126, loss: 1.2572 2024-05-31 22:39:28,721 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 15:27:43, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0718, loss_rpn_bbox: 0.0803, loss_cls: 0.3694, acc: 89.3008, loss_bbox: 0.3901, loss_mask: 0.4179, loss: 1.3295 2024-05-31 22:40:00,311 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 15:26:37, time: 0.632, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0680, loss_rpn_bbox: 0.0838, loss_cls: 0.3485, acc: 89.8147, loss_bbox: 0.3715, loss_mask: 0.4132, loss: 1.2850 2024-05-31 22:40:31,786 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 15:25:27, time: 0.629, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0716, loss_rpn_bbox: 0.0785, loss_cls: 0.3352, acc: 90.2034, loss_bbox: 0.3520, loss_mask: 0.4055, loss: 1.2428 2024-05-31 22:41:03,539 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 15:24:35, time: 0.635, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0688, loss_rpn_bbox: 0.0781, loss_cls: 0.3472, acc: 89.8169, loss_bbox: 0.3685, loss_mask: 0.4094, loss: 1.2719 2024-05-31 22:41:35,129 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 15:23:36, time: 0.632, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0708, loss_rpn_bbox: 0.0810, loss_cls: 0.3323, acc: 90.1013, loss_bbox: 0.3599, loss_mask: 0.4028, loss: 1.2467 2024-05-31 22:42:09,417 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 15:25:04, time: 0.686, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0656, loss_rpn_bbox: 0.0773, loss_cls: 0.3318, acc: 90.2080, loss_bbox: 0.3487, loss_mask: 0.3875, loss: 1.2109 2024-05-31 22:42:40,815 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 15:23:53, time: 0.628, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0666, loss_rpn_bbox: 0.0739, loss_cls: 0.3185, acc: 90.4272, loss_bbox: 0.3441, loss_mask: 0.3853, loss: 1.1884 2024-05-31 22:43:12,365 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 15:22:52, time: 0.631, data_time: 0.032, memory: 13277, 
loss_rpn_cls: 0.0606, loss_rpn_bbox: 0.0720, loss_cls: 0.3287, acc: 90.3342, loss_bbox: 0.3488, loss_mask: 0.3926, loss: 1.2028 2024-05-31 22:43:43,995 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 15:21:57, time: 0.633, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0611, loss_rpn_bbox: 0.0751, loss_cls: 0.3285, acc: 90.1387, loss_bbox: 0.3521, loss_mask: 0.3876, loss: 1.2044 2024-05-31 22:44:15,538 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 15:20:59, time: 0.631, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0588, loss_rpn_bbox: 0.0789, loss_cls: 0.3275, acc: 90.2400, loss_bbox: 0.3545, loss_mask: 0.3911, loss: 1.2108 2024-05-31 22:44:47,003 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 15:19:59, time: 0.629, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0652, loss_rpn_bbox: 0.0788, loss_cls: 0.3313, acc: 89.9521, loss_bbox: 0.3569, loss_mask: 0.3887, loss: 1.2211 2024-05-31 22:45:18,736 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 15:19:12, time: 0.635, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0787, loss_cls: 0.3369, acc: 89.8042, loss_bbox: 0.3640, loss_mask: 0.3862, loss: 1.2251 2024-05-31 22:45:49,978 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 15:18:05, time: 0.625, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0573, loss_rpn_bbox: 0.0738, loss_cls: 0.3140, acc: 90.4861, loss_bbox: 0.3427, loss_mask: 0.3756, loss: 1.1635 2024-05-31 22:46:21,598 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 22:46:21,598 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 15:17:16, time: 0.632, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0749, loss_cls: 0.3263, acc: 89.7939, loss_bbox: 0.3632, loss_mask: 0.3797, loss: 1.2061 2024-05-31 22:46:53,185 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 15:16:26, time: 0.632, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0606, loss_rpn_bbox: 0.0730, loss_cls: 0.3202, acc: 90.1499, loss_bbox: 0.3452, loss_mask: 0.3771, loss: 1.1761 2024-05-31 22:47:34,081 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 15:21:57, time: 0.818, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0562, loss_rpn_bbox: 0.0705, loss_cls: 0.3321, acc: 89.8123, loss_bbox: 0.3649, loss_mask: 0.3763, loss: 1.2000 2024-05-31 22:48:05,848 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 15:21:07, time: 0.635, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0592, loss_rpn_bbox: 0.0734, loss_cls: 0.3306, acc: 89.9119, loss_bbox: 0.3632, loss_mask: 0.3723, loss: 1.1988 2024-05-31 22:48:37,473 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 15:20:13, time: 0.633, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0729, loss_cls: 0.3222, acc: 90.1487, loss_bbox: 0.3559, loss_mask: 0.3713, loss: 1.1793 2024-05-31 22:49:09,539 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 15:19:36, time: 0.641, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0619, loss_rpn_bbox: 0.0762, loss_cls: 0.3270, acc: 89.9802, loss_bbox: 0.3568, loss_mask: 0.3681, loss: 1.1900 2024-05-31 22:49:41,081 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 15:18:39, time: 0.631, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0512, loss_rpn_bbox: 0.0693, loss_cls: 0.3149, acc: 90.2183, loss_bbox: 0.3518, loss_mask: 0.3668, loss: 1.1539 2024-05-31 22:50:12,685 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 15:17:47, time: 0.632, data_time: 0.034, memory: 
13277, loss_rpn_cls: 0.0559, loss_rpn_bbox: 0.0724, loss_cls: 0.3144, acc: 90.0884, loss_bbox: 0.3546, loss_mask: 0.3640, loss: 1.1613 2024-05-31 22:50:44,356 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 15:16:57, time: 0.633, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0730, loss_cls: 0.3071, acc: 90.4407, loss_bbox: 0.3437, loss_mask: 0.3667, loss: 1.1471 2024-05-31 22:51:16,300 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 15:16:17, time: 0.638, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0557, loss_rpn_bbox: 0.0713, loss_cls: 0.3131, acc: 90.1692, loss_bbox: 0.3588, loss_mask: 0.3604, loss: 1.1592 2024-05-31 22:51:49,759 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 15:16:30, time: 0.670, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0484, loss_rpn_bbox: 0.0669, loss_cls: 0.3014, acc: 90.6304, loss_bbox: 0.3380, loss_mask: 0.3576, loss: 1.1124 2024-05-31 22:52:21,518 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 15:15:44, time: 0.635, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0584, loss_rpn_bbox: 0.0759, loss_cls: 0.3196, acc: 89.8953, loss_bbox: 0.3589, loss_mask: 0.3582, loss: 1.1710 2024-05-31 22:52:53,163 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 15:14:55, time: 0.633, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0574, loss_rpn_bbox: 0.0730, loss_cls: 0.3173, acc: 90.2705, loss_bbox: 0.3469, loss_mask: 0.3622, loss: 1.1567 2024-05-31 22:53:24,689 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 15:14:02, time: 0.631, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0504, loss_rpn_bbox: 0.0684, loss_cls: 0.3006, acc: 90.5769, loss_bbox: 0.3433, loss_mask: 0.3635, loss: 1.1263 2024-05-31 22:53:56,018 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 15:13:05, time: 0.627, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0582, loss_rpn_bbox: 0.0722, loss_cls: 0.3113, acc: 90.3257, loss_bbox: 0.3438, loss_mask: 0.3567, loss: 1.1422 2024-05-31 22:54:27,715 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 15:12:19, time: 0.634, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0543, loss_rpn_bbox: 0.0686, loss_cls: 0.3072, acc: 90.2449, loss_bbox: 0.3521, loss_mask: 0.3556, loss: 1.1378 2024-05-31 22:54:59,357 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 15:11:32, time: 0.633, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0506, loss_rpn_bbox: 0.0681, loss_cls: 0.3028, acc: 90.4558, loss_bbox: 0.3386, loss_mask: 0.3480, loss: 1.1080 2024-05-31 22:55:30,994 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 15:10:46, time: 0.633, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0551, loss_rpn_bbox: 0.0702, loss_cls: 0.3080, acc: 90.4795, loss_bbox: 0.3431, loss_mask: 0.3523, loss: 1.1288 2024-05-31 22:56:02,634 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 15:10:00, time: 0.633, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0500, loss_rpn_bbox: 0.0650, loss_cls: 0.2965, acc: 90.5784, loss_bbox: 0.3371, loss_mask: 0.3520, loss: 1.1005 2024-05-31 22:56:34,489 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 15:09:21, time: 0.637, data_time: 0.045, memory: 13277, loss_rpn_cls: 0.0531, loss_rpn_bbox: 0.0728, loss_cls: 0.3088, acc: 90.1975, loss_bbox: 0.3506, loss_mask: 0.3463, loss: 1.1315 2024-05-31 22:57:16,506 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 22:57:16,506 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 15:13:30, time: 0.840, data_time: 0.031, 
memory: 13277, loss_rpn_cls: 0.0525, loss_rpn_bbox: 0.0731, loss_cls: 0.2908, acc: 90.6917, loss_bbox: 0.3345, loss_mask: 0.3438, loss: 1.0946 2024-05-31 22:57:48,077 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 15:12:39, time: 0.631, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0524, loss_rpn_bbox: 0.0693, loss_cls: 0.2990, acc: 90.4575, loss_bbox: 0.3410, loss_mask: 0.3504, loss: 1.1121 2024-05-31 22:58:19,678 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 15:11:49, time: 0.632, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0530, loss_rpn_bbox: 0.0670, loss_cls: 0.2896, acc: 90.6787, loss_bbox: 0.3402, loss_mask: 0.3396, loss: 1.0894 2024-05-31 22:58:51,330 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 15:11:01, time: 0.633, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0513, loss_rpn_bbox: 0.0655, loss_cls: 0.3057, acc: 90.3298, loss_bbox: 0.3451, loss_mask: 0.3412, loss: 1.1088 2024-05-31 22:59:23,053 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 15:10:16, time: 0.634, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0684, loss_cls: 0.2943, acc: 90.4648, loss_bbox: 0.3387, loss_mask: 0.3450, loss: 1.0944 2024-05-31 22:59:54,593 - mmdet - INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 15:09:26, time: 0.631, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0523, loss_rpn_bbox: 0.0664, loss_cls: 0.2877, acc: 90.7771, loss_bbox: 0.3330, loss_mask: 0.3319, loss: 1.0712 2024-05-31 23:00:26,294 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 15:08:40, time: 0.634, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0494, loss_rpn_bbox: 0.0650, loss_cls: 0.2968, acc: 90.5220, loss_bbox: 0.3453, loss_mask: 0.3379, loss: 1.0945 2024-05-31 23:00:58,039 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 15:07:57, time: 0.635, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0476, loss_rpn_bbox: 0.0681, loss_cls: 0.2995, acc: 90.3220, loss_bbox: 0.3370, loss_mask: 0.3386, loss: 1.0908 2024-05-31 23:01:29,738 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 15:07:12, time: 0.634, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0490, loss_rpn_bbox: 0.0686, loss_cls: 0.3012, acc: 90.5269, loss_bbox: 0.3362, loss_mask: 0.3417, loss: 1.0968 2024-05-31 23:02:03,228 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 15:07:12, time: 0.670, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0511, loss_rpn_bbox: 0.0696, loss_cls: 0.2888, acc: 90.5718, loss_bbox: 0.3426, loss_mask: 0.3374, loss: 1.0894 2024-05-31 23:02:34,936 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 15:06:28, time: 0.634, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0513, loss_rpn_bbox: 0.0657, loss_cls: 0.2955, acc: 90.6860, loss_bbox: 0.3356, loss_mask: 0.3304, loss: 1.0784 2024-05-31 23:03:06,576 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 15:05:42, time: 0.633, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0473, loss_rpn_bbox: 0.0666, loss_cls: 0.2887, acc: 90.6538, loss_bbox: 0.3394, loss_mask: 0.3353, loss: 1.0774 2024-05-31 23:03:38,329 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 15:05:00, time: 0.635, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0500, loss_rpn_bbox: 0.0693, loss_cls: 0.3025, acc: 90.4568, loss_bbox: 0.3376, loss_mask: 0.3297, loss: 1.0892 2024-05-31 23:04:09,913 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 15:04:14, time: 0.632, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0492, loss_rpn_bbox: 0.0678, loss_cls: 0.2838, acc: 90.8723, 
loss_bbox: 0.3275, loss_mask: 0.3319, loss: 1.0602 2024-05-31 23:04:41,750 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 15:03:34, time: 0.637, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0481, loss_rpn_bbox: 0.0667, loss_cls: 0.2976, acc: 90.2859, loss_bbox: 0.3519, loss_mask: 0.3395, loss: 1.1038 2024-05-31 23:05:13,129 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 15:02:44, time: 0.628, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0467, loss_rpn_bbox: 0.0633, loss_cls: 0.2855, acc: 90.7346, loss_bbox: 0.3326, loss_mask: 0.3304, loss: 1.0583 2024-05-31 23:05:44,858 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 15:02:02, time: 0.635, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0483, loss_rpn_bbox: 0.0649, loss_cls: 0.2979, acc: 90.5107, loss_bbox: 0.3367, loss_mask: 0.3354, loss: 1.0832 2024-05-31 23:06:16,907 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 15:01:28, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0508, loss_rpn_bbox: 0.0653, loss_cls: 0.2854, acc: 90.6919, loss_bbox: 0.3334, loss_mask: 0.3285, loss: 1.0635 2024-05-31 23:06:58,348 - mmdet - INFO - Epoch [1][3900/7330] lr: 1.000e-04, eta: 15:04:16, time: 0.829, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0499, loss_rpn_bbox: 0.0660, loss_cls: 0.2911, acc: 90.5305, loss_bbox: 0.3383, loss_mask: 0.3326, loss: 1.0781 2024-05-31 23:07:30,090 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 15:03:32, time: 0.635, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0624, loss_cls: 0.2878, acc: 90.7112, loss_bbox: 0.3394, loss_mask: 0.3313, loss: 1.0705 2024-05-31 23:08:02,097 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 23:08:02,097 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 15:02:55, time: 0.640, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0513, loss_rpn_bbox: 0.0672, loss_cls: 0.2924, acc: 90.4175, loss_bbox: 0.3454, loss_mask: 0.3356, loss: 1.0919 2024-05-31 23:08:33,646 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 15:02:08, time: 0.631, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0473, loss_rpn_bbox: 0.0646, loss_cls: 0.2818, acc: 90.8032, loss_bbox: 0.3310, loss_mask: 0.3400, loss: 1.0647 2024-05-31 23:09:05,154 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 15:01:20, time: 0.630, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0478, loss_rpn_bbox: 0.0665, loss_cls: 0.2850, acc: 90.8093, loss_bbox: 0.3289, loss_mask: 0.3326, loss: 1.0609 2024-05-31 23:09:36,907 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 15:00:38, time: 0.635, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0468, loss_rpn_bbox: 0.0651, loss_cls: 0.2886, acc: 90.5608, loss_bbox: 0.3372, loss_mask: 0.3272, loss: 1.0649 2024-05-31 23:10:08,782 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 14:59:58, time: 0.637, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0455, loss_rpn_bbox: 0.0628, loss_cls: 0.2919, acc: 90.5959, loss_bbox: 0.3392, loss_mask: 0.3273, loss: 1.0666 2024-05-31 23:10:40,296 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 14:59:12, time: 0.630, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0477, loss_rpn_bbox: 0.0640, loss_cls: 0.2848, acc: 90.8928, loss_bbox: 0.3249, loss_mask: 0.3260, loss: 1.0475 2024-05-31 23:11:12,059 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 14:58:31, time: 0.635, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0500, loss_rpn_bbox: 0.0694, loss_cls: 0.3017, acc: 
90.2263, loss_bbox: 0.3471, loss_mask: 0.3304, loss: 1.0986 2024-05-31 23:11:45,876 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 14:58:29, time: 0.676, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0450, loss_rpn_bbox: 0.0664, loss_cls: 0.2905, acc: 90.5142, loss_bbox: 0.3376, loss_mask: 0.3213, loss: 1.0608 2024-05-31 23:12:17,626 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 14:57:48, time: 0.635, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0657, loss_cls: 0.2896, acc: 90.6340, loss_bbox: 0.3373, loss_mask: 0.3259, loss: 1.0641 2024-05-31 23:12:49,304 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 14:57:05, time: 0.634, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0648, loss_cls: 0.2893, acc: 90.5920, loss_bbox: 0.3357, loss_mask: 0.3254, loss: 1.0603 2024-05-31 23:13:21,201 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 14:56:27, time: 0.638, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0467, loss_rpn_bbox: 0.0621, loss_cls: 0.2793, acc: 90.7935, loss_bbox: 0.3262, loss_mask: 0.3190, loss: 1.0334 2024-05-31 23:13:52,846 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 14:55:43, time: 0.632, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0636, loss_cls: 0.2861, acc: 90.5740, loss_bbox: 0.3357, loss_mask: 0.3282, loss: 1.0551 2024-05-31 23:14:24,758 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 14:55:06, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0682, loss_cls: 0.2931, acc: 90.3015, loss_bbox: 0.3466, loss_mask: 0.3340, loss: 1.0915 2024-05-31 23:14:56,397 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 14:54:23, time: 0.633, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0412, loss_rpn_bbox: 0.0628, loss_cls: 0.2730, acc: 91.0610, loss_bbox: 0.3244, loss_mask: 0.3200, loss: 1.0214 2024-05-31 23:15:28,009 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 14:53:41, time: 0.632, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0452, loss_rpn_bbox: 0.0625, loss_cls: 0.2844, acc: 90.7590, loss_bbox: 0.3277, loss_mask: 0.3209, loss: 1.0406 2024-05-31 23:15:59,683 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 14:52:59, time: 0.633, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0463, loss_rpn_bbox: 0.0631, loss_cls: 0.2817, acc: 90.7297, loss_bbox: 0.3361, loss_mask: 0.3251, loss: 1.0523 2024-05-31 23:16:41,523 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 14:55:14, time: 0.837, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0462, loss_rpn_bbox: 0.0667, loss_cls: 0.2838, acc: 90.8535, loss_bbox: 0.3294, loss_mask: 0.3267, loss: 1.0527 2024-05-31 23:17:13,432 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 14:54:35, time: 0.638, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0455, loss_rpn_bbox: 0.0640, loss_cls: 0.2779, acc: 90.9741, loss_bbox: 0.3247, loss_mask: 0.3153, loss: 1.0274 2024-05-31 23:17:45,219 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 14:53:54, time: 0.636, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0633, loss_cls: 0.2843, acc: 90.9150, loss_bbox: 0.3204, loss_mask: 0.3202, loss: 1.0379 2024-05-31 23:18:16,747 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 14:53:09, time: 0.631, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0435, loss_rpn_bbox: 0.0595, loss_cls: 0.2740, acc: 91.0181, loss_bbox: 0.3266, loss_mask: 0.3185, loss: 1.0221 2024-05-31 23:18:48,434 - mmdet - 
INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 23:18:48,434 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 14:52:27, time: 0.634, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0441, loss_rpn_bbox: 0.0600, loss_cls: 0.2764, acc: 91.0232, loss_bbox: 0.3261, loss_mask: 0.3224, loss: 1.0291 2024-05-31 23:19:20,267 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 14:51:48, time: 0.637, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0452, loss_rpn_bbox: 0.0636, loss_cls: 0.2823, acc: 90.7461, loss_bbox: 0.3311, loss_mask: 0.3198, loss: 1.0421 2024-05-31 23:19:52,202 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 14:51:10, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0489, loss_rpn_bbox: 0.0656, loss_cls: 0.2921, acc: 90.4167, loss_bbox: 0.3421, loss_mask: 0.3196, loss: 1.0682 2024-05-31 23:20:24,059 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 14:50:31, time: 0.637, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0450, loss_rpn_bbox: 0.0637, loss_cls: 0.2830, acc: 90.6458, loss_bbox: 0.3333, loss_mask: 0.3213, loss: 1.0462 2024-05-31 23:20:56,003 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 14:49:54, time: 0.639, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0477, loss_rpn_bbox: 0.0685, loss_cls: 0.2928, acc: 90.3159, loss_bbox: 0.3425, loss_mask: 0.3195, loss: 1.0711 2024-05-31 23:21:29,526 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 14:49:42, time: 0.670, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0453, loss_rpn_bbox: 0.0631, loss_cls: 0.2764, acc: 91.0242, loss_bbox: 0.3255, loss_mask: 0.3157, loss: 1.0260 2024-05-31 23:22:01,362 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 14:49:02, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0646, loss_cls: 0.2709, acc: 91.0256, loss_bbox: 0.3244, loss_mask: 0.3232, loss: 1.0273 2024-05-31 23:22:33,145 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 14:48:23, time: 0.636, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0645, loss_cls: 0.2794, acc: 90.7063, loss_bbox: 0.3325, loss_mask: 0.3161, loss: 1.0355 2024-05-31 23:23:05,008 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 14:47:44, time: 0.637, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0446, loss_rpn_bbox: 0.0659, loss_cls: 0.2722, acc: 91.0200, loss_bbox: 0.3250, loss_mask: 0.3220, loss: 1.0297 2024-05-31 23:23:36,789 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 14:47:05, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0463, loss_rpn_bbox: 0.0611, loss_cls: 0.2804, acc: 90.7083, loss_bbox: 0.3282, loss_mask: 0.3199, loss: 1.0358 2024-05-31 23:24:08,341 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 14:46:22, time: 0.631, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0601, loss_cls: 0.2633, acc: 91.1501, loss_bbox: 0.3117, loss_mask: 0.3157, loss: 0.9942 2024-05-31 23:24:39,960 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 14:45:40, time: 0.632, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0570, loss_cls: 0.2689, acc: 91.3113, loss_bbox: 0.3092, loss_mask: 0.3122, loss: 0.9890 2024-05-31 23:25:11,554 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 14:44:59, time: 0.632, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0388, loss_rpn_bbox: 0.0612, loss_cls: 0.2672, acc: 91.2971, loss_bbox: 0.3105, loss_mask: 0.3115, loss: 0.9892 2024-05-31 23:25:43,309 - 
mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 14:44:20, time: 0.635, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0640, loss_cls: 0.2852, acc: 90.8035, loss_bbox: 0.3202, loss_mask: 0.3175, loss: 1.0320 2024-05-31 23:26:26,144 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 14:46:20, time: 0.857, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0600, loss_cls: 0.2661, acc: 91.2415, loss_bbox: 0.3115, loss_mask: 0.3161, loss: 0.9991 2024-05-31 23:26:57,644 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 14:45:36, time: 0.630, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0426, loss_rpn_bbox: 0.0620, loss_cls: 0.2619, acc: 91.2144, loss_bbox: 0.3167, loss_mask: 0.3136, loss: 0.9967 2024-05-31 23:27:29,396 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 14:44:56, time: 0.635, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0381, loss_rpn_bbox: 0.0615, loss_cls: 0.2654, acc: 91.1899, loss_bbox: 0.3177, loss_mask: 0.3176, loss: 1.0004 2024-05-31 23:28:01,311 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 14:44:18, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0635, loss_cls: 0.2740, acc: 90.9287, loss_bbox: 0.3245, loss_mask: 0.3189, loss: 1.0239 2024-05-31 23:28:33,145 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 14:43:39, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0469, loss_rpn_bbox: 0.0650, loss_cls: 0.2753, acc: 90.7063, loss_bbox: 0.3308, loss_mask: 0.3203, loss: 1.0383 2024-05-31 23:29:05,013 - mmdet - INFO - Epoch [1][5950/7330] lr: 1.000e-04, eta: 14:43:01, time: 0.637, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0626, loss_cls: 0.2730, acc: 90.9050, loss_bbox: 0.3298, loss_mask: 0.3172, loss: 1.0269 2024-05-31 23:29:36,824 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-05-31 23:29:36,824 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 14:42:22, time: 0.636, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0614, loss_cls: 0.2736, acc: 91.0376, loss_bbox: 0.3183, loss_mask: 0.3135, loss: 1.0082 2024-05-31 23:30:08,392 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 14:41:39, time: 0.631, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0435, loss_rpn_bbox: 0.0592, loss_cls: 0.2655, acc: 91.2939, loss_bbox: 0.3127, loss_mask: 0.3089, loss: 0.9897 2024-05-31 23:30:40,158 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 14:41:00, time: 0.635, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0448, loss_rpn_bbox: 0.0634, loss_cls: 0.2797, acc: 90.7112, loss_bbox: 0.3293, loss_mask: 0.3126, loss: 1.0299 2024-05-31 23:31:13,807 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 14:40:46, time: 0.673, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0608, loss_cls: 0.2697, acc: 91.3059, loss_bbox: 0.3167, loss_mask: 0.3090, loss: 0.9996 2024-05-31 23:31:45,726 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 14:40:09, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0594, loss_cls: 0.2642, acc: 91.2026, loss_bbox: 0.3157, loss_mask: 0.3040, loss: 0.9834 2024-05-31 23:32:17,603 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 14:39:31, time: 0.638, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0416, loss_rpn_bbox: 0.0598, loss_cls: 0.2747, acc: 90.9082, loss_bbox: 0.3196, loss_mask: 0.3165, loss: 1.0122 2024-05-31 
23:32:49,672 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 14:38:56, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0417, loss_rpn_bbox: 0.0618, loss_cls: 0.2720, acc: 90.9312, loss_bbox: 0.3171, loss_mask: 0.3084, loss: 1.0010 2024-05-31 23:33:21,334 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 14:38:15, time: 0.633, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0404, loss_rpn_bbox: 0.0595, loss_cls: 0.2569, acc: 91.3533, loss_bbox: 0.3068, loss_mask: 0.3051, loss: 0.9687 2024-05-31 23:33:52,953 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 14:37:35, time: 0.632, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0429, loss_rpn_bbox: 0.0615, loss_cls: 0.2720, acc: 91.0142, loss_bbox: 0.3171, loss_mask: 0.3143, loss: 1.0078 2024-05-31 23:34:24,548 - mmdet - INFO - Epoch [1][6450/7330] lr: 1.000e-04, eta: 14:36:54, time: 0.632, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0574, loss_cls: 0.2657, acc: 91.2195, loss_bbox: 0.3137, loss_mask: 0.3136, loss: 0.9891 2024-05-31 23:34:56,111 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 14:36:12, time: 0.631, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0469, loss_rpn_bbox: 0.0629, loss_cls: 0.2706, acc: 91.1150, loss_bbox: 0.3173, loss_mask: 0.3134, loss: 1.0111 2024-05-31 23:35:27,924 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 14:35:34, time: 0.636, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0587, loss_cls: 0.2657, acc: 90.9795, loss_bbox: 0.3169, loss_mask: 0.3177, loss: 1.0014 2024-05-31 23:36:08,850 - mmdet - INFO - Epoch [1][6600/7330] lr: 1.000e-04, eta: 14:36:49, time: 0.819, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0582, loss_cls: 0.2612, acc: 91.2163, loss_bbox: 0.3132, loss_mask: 0.3147, loss: 0.9866 2024-05-31 23:36:42,934 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 14:36:38, time: 0.682, data_time: 0.043, memory: 13277, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0617, loss_cls: 0.2632, acc: 91.2405, loss_bbox: 0.3136, loss_mask: 0.3067, loss: 0.9866 2024-05-31 23:37:14,568 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 14:35:57, time: 0.633, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0407, loss_rpn_bbox: 0.0604, loss_cls: 0.2702, acc: 91.0820, loss_bbox: 0.3132, loss_mask: 0.3114, loss: 0.9960 2024-05-31 23:37:46,434 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 14:35:19, time: 0.637, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0397, loss_rpn_bbox: 0.0582, loss_cls: 0.2660, acc: 91.2180, loss_bbox: 0.3142, loss_mask: 0.3080, loss: 0.9860 2024-05-31 23:38:18,238 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 14:34:40, time: 0.637, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0596, loss_cls: 0.2689, acc: 91.1252, loss_bbox: 0.3158, loss_mask: 0.3056, loss: 0.9899 2024-05-31 23:38:50,016 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 14:34:01, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0577, loss_cls: 0.2644, acc: 91.2791, loss_bbox: 0.3161, loss_mask: 0.3103, loss: 0.9896 2024-05-31 23:39:21,874 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 14:33:24, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0602, loss_cls: 0.2778, acc: 90.8606, loss_bbox: 0.3242, loss_mask: 0.3082, loss: 1.0127 2024-05-31 23:39:53,364 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 14:32:42, time: 0.630, 
data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0409, loss_rpn_bbox: 0.0585, loss_cls: 0.2488, acc: 91.8186, loss_bbox: 0.2950, loss_mask: 0.3032, loss: 0.9464
2024-05-31 23:40:25,233 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-05-31 23:40:25,233 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 14:32:04, time: 0.637, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0608, loss_cls: 0.2677, acc: 90.9651, loss_bbox: 0.3201, loss_mask: 0.3072, loss: 0.9969
2024-05-31 23:40:58,728 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 14:31:45, time: 0.669, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0385, loss_rpn_bbox: 0.0571, loss_cls: 0.2608, acc: 91.3611, loss_bbox: 0.3084, loss_mask: 0.2980, loss: 0.9628
2024-05-31 23:41:30,560 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 14:31:07, time: 0.637, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0422, loss_rpn_bbox: 0.0578, loss_cls: 0.2645, acc: 91.4202, loss_bbox: 0.3101, loss_mask: 0.3082, loss: 0.9828
2024-05-31 23:42:02,227 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 14:30:28, time: 0.633, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0567, loss_cls: 0.2649, acc: 91.0271, loss_bbox: 0.3170, loss_mask: 0.3034, loss: 0.9797
2024-05-31 23:42:33,942 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 14:29:49, time: 0.634, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0421, loss_rpn_bbox: 0.0569, loss_cls: 0.2571, acc: 91.4255, loss_bbox: 0.2999, loss_mask: 0.2977, loss: 0.9536
2024-05-31 23:43:05,755 - mmdet - INFO - Epoch [1][7250/7330] lr: 1.000e-04, eta: 14:29:11, time: 0.636, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0599, loss_cls: 0.2624, acc: 91.1931, loss_bbox: 0.3127, loss_mask: 0.3029, loss: 0.9770
2024-05-31 23:43:37,431 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 14:28:31, time: 0.634, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0454, loss_rpn_bbox: 0.0607, loss_cls: 0.2601, acc: 91.4158, loss_bbox: 0.3089, loss_mask: 0.3102, loss: 0.9854
2024-05-31 23:43:57,377 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-31 23:45:39,909 - mmdet - INFO - Evaluating bbox...
2024-05-31 23:46:09,106 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.302
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.555
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.298
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.152
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.327
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.451
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.427
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.427
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.427
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.226
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.462
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.599
2024-05-31 23:46:09,106 - mmdet - INFO - Evaluating segm...
2024-05-31 23:46:41,503 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.287
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.508
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.289
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.091
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.306
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.503
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.396
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.396
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.396
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.166
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.438
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.609
2024-05-31 23:46:42,044 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-05-31 23:46:42,045 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.3020, bbox_mAP_50: 0.5550, bbox_mAP_75: 0.2980, bbox_mAP_s: 0.1520, bbox_mAP_m: 0.3270, bbox_mAP_l: 0.4510, bbox_mAP_copypaste: 0.302 0.555 0.298 0.152 0.327 0.451, segm_mAP: 0.2870, segm_mAP_50: 0.5080, segm_mAP_75: 0.2890, segm_mAP_s: 0.0910, segm_mAP_m: 0.3060, segm_mAP_l: 0.5030, segm_mAP_copypaste: 0.287 0.508 0.289 0.091 0.306 0.503
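The Epoch(val) record above condenses the full COCO summary into a single line, which makes it the easiest place to track progress across epochs. As a minimal sketch (assuming the mmdetection 2.x text-log format shown here; the path train.log is a placeholder), the per-epoch validation mAP can be pulled out like this:

```python
import re

# Minimal sketch: collect per-epoch validation mAP from an mmdetection 2.x
# text log such as the one above. The path "train.log" is a placeholder.
VAL_RE = re.compile(
    r"Epoch\(val\)\s*\[(\d+)\]\[\d+\]\s*bbox_mAP:\s*([\d.]+).*?segm_mAP:\s*([\d.]+)"
)

def val_map_by_epoch(log_path="train.log"):
    results = {}
    with open(log_path) as f:
        for line in f:
            m = VAL_RE.search(line)
            if m:
                results[int(m.group(1))] = {
                    "bbox_mAP": float(m.group(2)),
                    "segm_mAP": float(m.group(3)),
                }
    return results

# For the epoch-1 summary above this yields {1: {'bbox_mAP': 0.302, 'segm_mAP': 0.287}}.
```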
2024-05-31 23:47:24,998 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 14:26:04, time: 0.859, data_time: 0.097, memory: 13277, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0544, loss_cls: 0.2512, acc: 91.6138, loss_bbox: 0.3015, loss_mask: 0.3039, loss: 0.9454
2024-05-31 23:47:57,155 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 14:25:31, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0573, loss_cls: 0.2495, acc: 91.5442, loss_bbox: 0.3036, loss_mask: 0.3002, loss: 0.9494
2024-05-31 23:48:31,437 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 14:25:21, time: 0.686, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0626, loss_cls: 0.2651, acc: 90.8665, loss_bbox: 0.3248, loss_mask: 0.3061, loss: 0.9983
2024-05-31 23:49:03,498 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 14:24:46, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0570, loss_cls: 0.2497, acc: 91.5679, loss_bbox: 0.3034, loss_mask: 0.3017, loss: 0.9493
2024-05-31 23:49:35,766 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 14:24:14, time: 0.645, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0559, loss_cls: 0.2577, acc: 91.3914, loss_bbox: 0.3103, loss_mask: 0.2995, loss: 0.9597
2024-05-31 23:50:07,503 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 14:23:36, time: 0.635, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0541, loss_cls: 0.2444, acc: 91.7366, loss_bbox: 0.2987, loss_mask: 0.2976, loss: 0.9305
2024-05-31 23:50:39,681 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 14:23:03, time: 0.644, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0565, loss_cls: 0.2610, acc: 91.3735, loss_bbox: 0.3098, loss_mask: 0.3010, loss: 0.9648
2024-05-31 23:51:11,082 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 14:22:22, time: 0.628, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0545, loss_cls: 0.2456, acc: 91.6880, loss_bbox: 0.2997,
loss_mask: 0.2986, loss: 0.9312 2024-05-31 23:51:43,045 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 14:21:47, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0568, loss_cls: 0.2477, acc: 91.6409, loss_bbox: 0.2943, loss_mask: 0.2948, loss: 0.9318 2024-05-31 23:52:15,415 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 14:21:16, time: 0.647, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0600, loss_cls: 0.2644, acc: 90.9558, loss_bbox: 0.3196, loss_mask: 0.3002, loss: 0.9840 2024-05-31 23:52:47,480 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 14:20:42, time: 0.642, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0582, loss_cls: 0.2485, acc: 91.6494, loss_bbox: 0.2994, loss_mask: 0.3005, loss: 0.9435 2024-05-31 23:53:19,953 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 14:20:12, time: 0.649, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0596, loss_cls: 0.2546, acc: 91.2959, loss_bbox: 0.3094, loss_mask: 0.2976, loss: 0.9576 2024-05-31 23:53:51,566 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 14:19:34, time: 0.632, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0560, loss_cls: 0.2501, acc: 91.5806, loss_bbox: 0.3002, loss_mask: 0.2953, loss: 0.9362 2024-05-31 23:54:23,470 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 14:18:58, time: 0.638, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0391, loss_rpn_bbox: 0.0632, loss_cls: 0.2628, acc: 90.9783, loss_bbox: 0.3242, loss_mask: 0.3128, loss: 1.0021 2024-05-31 23:55:03,119 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 14:19:39, time: 0.793, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0564, loss_cls: 0.2505, acc: 91.6057, loss_bbox: 0.3063, loss_mask: 0.3003, loss: 0.9505 2024-05-31 23:55:45,060 - mmdet - INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 14:20:41, time: 0.839, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0548, loss_cls: 0.2447, acc: 91.7305, loss_bbox: 0.2980, loss_mask: 0.2931, loss: 0.9265 2024-05-31 23:56:17,187 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 14:20:07, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0602, loss_cls: 0.2509, acc: 91.4883, loss_bbox: 0.3019, loss_mask: 0.3002, loss: 0.9501 2024-05-31 23:56:49,230 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 14:19:32, time: 0.641, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0373, loss_rpn_bbox: 0.0591, loss_cls: 0.2625, acc: 91.2234, loss_bbox: 0.3126, loss_mask: 0.3028, loss: 0.9744 2024-05-31 23:57:21,562 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 14:18:59, time: 0.647, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0595, loss_cls: 0.2711, acc: 90.7727, loss_bbox: 0.3261, loss_mask: 0.3041, loss: 0.9979 2024-05-31 23:57:53,244 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 14:18:20, time: 0.633, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0591, loss_cls: 0.2558, acc: 91.2566, loss_bbox: 0.3156, loss_mask: 0.3031, loss: 0.9731 2024-05-31 23:58:25,038 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 14:17:43, time: 0.636, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0564, loss_cls: 0.2509, acc: 91.5261, loss_bbox: 0.3020, loss_mask: 0.2971, loss: 0.9430 2024-05-31 23:58:56,766 - mmdet - INFO - Epoch [2][1100/7330] lr: 
1.000e-04, eta: 14:17:05, time: 0.634, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0579, loss_cls: 0.2572, acc: 91.3547, loss_bbox: 0.3078, loss_mask: 0.2907, loss: 0.9495 2024-05-31 23:59:28,630 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 14:16:28, time: 0.637, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0397, loss_rpn_bbox: 0.0562, loss_cls: 0.2559, acc: 91.2217, loss_bbox: 0.3108, loss_mask: 0.3019, loss: 0.9644 2024-06-01 00:00:00,792 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 14:15:55, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0595, loss_cls: 0.2664, acc: 91.0044, loss_bbox: 0.3176, loss_mask: 0.2999, loss: 0.9796 2024-06-01 00:00:34,952 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 14:15:39, time: 0.683, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0557, loss_cls: 0.2513, acc: 91.5073, loss_bbox: 0.3046, loss_mask: 0.3030, loss: 0.9521 2024-06-01 00:01:06,918 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 14:15:03, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0556, loss_cls: 0.2533, acc: 91.3108, loss_bbox: 0.3092, loss_mask: 0.2963, loss: 0.9505 2024-06-01 00:01:38,745 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 14:14:26, time: 0.637, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0537, loss_cls: 0.2462, acc: 91.4592, loss_bbox: 0.3028, loss_mask: 0.2910, loss: 0.9287 2024-06-01 00:02:11,084 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 14:13:54, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0593, loss_cls: 0.2607, acc: 91.0640, loss_bbox: 0.3191, loss_mask: 0.3017, loss: 0.9771 2024-06-01 00:02:43,032 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 14:13:18, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0601, loss_cls: 0.2622, acc: 91.1230, loss_bbox: 0.3161, loss_mask: 0.2977, loss: 0.9745 2024-06-01 00:03:15,116 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 14:12:44, time: 0.642, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0367, loss_rpn_bbox: 0.0537, loss_cls: 0.2422, acc: 91.8354, loss_bbox: 0.2918, loss_mask: 0.2906, loss: 0.9149 2024-06-01 00:03:46,677 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 14:12:05, time: 0.631, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0579, loss_cls: 0.2479, acc: 91.8215, loss_bbox: 0.2887, loss_mask: 0.2900, loss: 0.9223 2024-06-01 00:04:18,494 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 14:11:28, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0540, loss_cls: 0.2490, acc: 91.5774, loss_bbox: 0.3030, loss_mask: 0.2929, loss: 0.9334 2024-06-01 00:04:50,102 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 14:10:49, time: 0.632, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0545, loss_cls: 0.2456, acc: 91.6011, loss_bbox: 0.3000, loss_mask: 0.2971, loss: 0.9338 2024-06-01 00:05:22,095 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 14:10:14, time: 0.640, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0563, loss_cls: 0.2485, acc: 91.6311, loss_bbox: 0.2997, loss_mask: 0.2941, loss: 0.9358 2024-06-01 00:05:54,287 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 14:09:41, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0348, 
loss_rpn_bbox: 0.0548, loss_cls: 0.2514, acc: 91.4482, loss_bbox: 0.2994, loss_mask: 0.2924, loss: 0.9329 2024-06-01 00:06:26,242 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 14:09:05, time: 0.639, data_time: 0.044, memory: 13277, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0565, loss_cls: 0.2510, acc: 91.2688, loss_bbox: 0.3111, loss_mask: 0.2951, loss: 0.9513 2024-06-01 00:07:07,645 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 14:09:51, time: 0.828, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0565, loss_cls: 0.2469, acc: 91.6509, loss_bbox: 0.3032, loss_mask: 0.2950, loss: 0.9406 2024-06-01 00:07:45,587 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 14:10:06, time: 0.759, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0403, loss_rpn_bbox: 0.0591, loss_cls: 0.2558, acc: 91.2053, loss_bbox: 0.3084, loss_mask: 0.2982, loss: 0.9617 2024-06-01 00:08:17,482 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 14:09:29, time: 0.638, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0550, loss_cls: 0.2547, acc: 91.3118, loss_bbox: 0.3093, loss_mask: 0.2857, loss: 0.9405 2024-06-01 00:08:49,430 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 14:08:53, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0560, loss_cls: 0.2480, acc: 91.4797, loss_bbox: 0.3050, loss_mask: 0.2918, loss: 0.9357 2024-06-01 00:09:21,286 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 14:08:16, time: 0.637, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0582, loss_cls: 0.2564, acc: 91.1516, loss_bbox: 0.3172, loss_mask: 0.2987, loss: 0.9682 2024-06-01 00:09:53,039 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 14:07:39, time: 0.635, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0570, loss_cls: 0.2493, acc: 91.4749, loss_bbox: 0.3047, loss_mask: 0.2903, loss: 0.9385 2024-06-01 00:10:24,928 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 14:07:02, time: 0.638, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0553, loss_cls: 0.2445, acc: 91.6479, loss_bbox: 0.3016, loss_mask: 0.2917, loss: 0.9311 2024-06-01 00:10:56,424 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 14:06:23, time: 0.630, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0501, loss_cls: 0.2372, acc: 91.9302, loss_bbox: 0.2908, loss_mask: 0.2899, loss: 0.9005 2024-06-01 00:11:28,083 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 14:05:44, time: 0.633, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0535, loss_cls: 0.2439, acc: 91.7639, loss_bbox: 0.2902, loss_mask: 0.2932, loss: 0.9172 2024-06-01 00:12:02,113 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 14:05:26, time: 0.681, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0525, loss_cls: 0.2399, acc: 91.7080, loss_bbox: 0.2908, loss_mask: 0.2887, loss: 0.9075 2024-06-01 00:12:34,218 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 14:04:51, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0545, loss_cls: 0.2476, acc: 91.5671, loss_bbox: 0.2950, loss_mask: 0.2854, loss: 0.9161 2024-06-01 00:13:06,211 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 14:04:15, time: 0.640, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0580, loss_cls: 0.2502, acc: 91.3560, loss_bbox: 0.3137, loss_mask: 0.2979, 
loss: 0.9561 2024-06-01 00:13:37,949 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 14:03:38, time: 0.635, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0498, loss_cls: 0.2372, acc: 91.9434, loss_bbox: 0.2877, loss_mask: 0.2855, loss: 0.8928 2024-06-01 00:14:09,921 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 14:03:02, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0559, loss_cls: 0.2480, acc: 91.5725, loss_bbox: 0.3039, loss_mask: 0.2984, loss: 0.9403 2024-06-01 00:14:41,770 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 14:02:26, time: 0.637, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0557, loss_cls: 0.2472, acc: 91.5874, loss_bbox: 0.3037, loss_mask: 0.2960, loss: 0.9367 2024-06-01 00:15:13,628 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 14:01:50, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0586, loss_cls: 0.2454, acc: 91.6418, loss_bbox: 0.2954, loss_mask: 0.2937, loss: 0.9294 2024-06-01 00:15:45,378 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 14:01:12, time: 0.635, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0550, loss_cls: 0.2472, acc: 91.5046, loss_bbox: 0.3022, loss_mask: 0.2926, loss: 0.9333 2024-06-01 00:16:17,094 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 14:00:35, time: 0.634, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0517, loss_cls: 0.2327, acc: 91.9941, loss_bbox: 0.2839, loss_mask: 0.2864, loss: 0.8874 2024-06-01 00:16:48,726 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 13:59:57, time: 0.633, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0554, loss_cls: 0.2428, acc: 91.7341, loss_bbox: 0.2967, loss_mask: 0.2880, loss: 0.9173 2024-06-01 00:17:20,360 - mmdet - INFO - Epoch [2][2800/7330] lr: 1.000e-04, eta: 13:59:19, time: 0.633, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0517, loss_cls: 0.2402, acc: 91.8779, loss_bbox: 0.2891, loss_mask: 0.2855, loss: 0.9007 2024-06-01 00:17:52,216 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 13:58:43, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0591, loss_cls: 0.2520, acc: 91.4465, loss_bbox: 0.3019, loss_mask: 0.2881, loss: 0.9407 2024-06-01 00:18:26,716 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 13:58:27, time: 0.690, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0557, loss_cls: 0.2460, acc: 91.6252, loss_bbox: 0.2932, loss_mask: 0.2857, loss: 0.9167 2024-06-01 00:19:10,074 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 13:59:18, time: 0.867, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0544, loss_cls: 0.2454, acc: 91.5332, loss_bbox: 0.3002, loss_mask: 0.2917, loss: 0.9280 2024-06-01 00:19:41,835 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 13:58:40, time: 0.635, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0540, loss_cls: 0.2352, acc: 92.0244, loss_bbox: 0.2832, loss_mask: 0.2797, loss: 0.8886 2024-06-01 00:20:13,872 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 13:58:05, time: 0.641, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0360, loss_rpn_bbox: 0.0578, loss_cls: 0.2482, acc: 91.4910, loss_bbox: 0.2972, loss_mask: 0.2957, loss: 0.9349 2024-06-01 00:20:46,136 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 
13:57:32, time: 0.645, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0576, loss_cls: 0.2475, acc: 91.5449, loss_bbox: 0.3012, loss_mask: 0.2962, loss: 0.9384 2024-06-01 00:21:18,197 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 13:56:57, time: 0.641, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0530, loss_cls: 0.2437, acc: 91.6858, loss_bbox: 0.2924, loss_mask: 0.2894, loss: 0.9160 2024-06-01 00:21:50,245 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 13:56:22, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0573, loss_cls: 0.2458, acc: 91.5879, loss_bbox: 0.3030, loss_mask: 0.2956, loss: 0.9369 2024-06-01 00:22:22,031 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 13:55:45, time: 0.636, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0569, loss_cls: 0.2537, acc: 91.3245, loss_bbox: 0.3046, loss_mask: 0.2999, loss: 0.9491 2024-06-01 00:22:54,013 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 13:55:09, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0533, loss_cls: 0.2368, acc: 91.8215, loss_bbox: 0.2952, loss_mask: 0.2890, loss: 0.9051 2024-06-01 00:23:25,687 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 13:54:32, time: 0.633, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0553, loss_cls: 0.2442, acc: 91.7305, loss_bbox: 0.2927, loss_mask: 0.2974, loss: 0.9251 2024-06-01 00:24:00,057 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 13:54:13, time: 0.687, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0532, loss_cls: 0.2365, acc: 92.0178, loss_bbox: 0.2869, loss_mask: 0.2867, loss: 0.8961 2024-06-01 00:24:31,911 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 13:53:37, time: 0.637, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0562, loss_cls: 0.2471, acc: 91.6580, loss_bbox: 0.3003, loss_mask: 0.2901, loss: 0.9293 2024-06-01 00:25:03,852 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 13:53:01, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0550, loss_cls: 0.2370, acc: 91.9041, loss_bbox: 0.2882, loss_mask: 0.2855, loss: 0.9002 2024-06-01 00:25:35,929 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 13:52:27, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0565, loss_cls: 0.2424, acc: 91.7517, loss_bbox: 0.2980, loss_mask: 0.2900, loss: 0.9215 2024-06-01 00:26:07,652 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 13:51:49, time: 0.634, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0536, loss_cls: 0.2425, acc: 91.8921, loss_bbox: 0.2895, loss_mask: 0.2882, loss: 0.9109 2024-06-01 00:26:39,440 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 13:51:13, time: 0.636, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0531, loss_cls: 0.2472, acc: 91.6372, loss_bbox: 0.2959, loss_mask: 0.2927, loss: 0.9230 2024-06-01 00:27:11,426 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 13:50:38, time: 0.640, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0543, loss_cls: 0.2430, acc: 91.6711, loss_bbox: 0.2997, loss_mask: 0.2904, loss: 0.9189 2024-06-01 00:27:42,961 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 13:49:59, time: 0.631, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0342, loss_rpn_bbox: 
0.0555, loss_cls: 0.2434, acc: 91.6648, loss_bbox: 0.2923, loss_mask: 0.2898, loss: 0.9152 2024-06-01 00:28:14,989 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 13:49:24, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0534, loss_cls: 0.2501, acc: 91.4387, loss_bbox: 0.3069, loss_mask: 0.2939, loss: 0.9393 2024-06-01 00:28:47,110 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 13:48:50, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0565, loss_cls: 0.2419, acc: 91.6609, loss_bbox: 0.2945, loss_mask: 0.2881, loss: 0.9172 2024-06-01 00:29:18,810 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 13:48:13, time: 0.634, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0552, loss_cls: 0.2468, acc: 91.5071, loss_bbox: 0.2935, loss_mask: 0.2887, loss: 0.9178 2024-06-01 00:29:50,509 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 13:47:36, time: 0.634, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0532, loss_cls: 0.2369, acc: 92.0964, loss_bbox: 0.2793, loss_mask: 0.2745, loss: 0.8769 2024-06-01 00:30:30,366 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 13:47:54, time: 0.797, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0575, loss_cls: 0.2513, acc: 91.4189, loss_bbox: 0.3040, loss_mask: 0.2910, loss: 0.9396 2024-06-01 00:31:11,146 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 13:48:18, time: 0.816, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0552, loss_cls: 0.2414, acc: 91.7305, loss_bbox: 0.2961, loss_mask: 0.2892, loss: 0.9167 2024-06-01 00:31:43,254 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 13:47:43, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0557, loss_cls: 0.2403, acc: 91.8562, loss_bbox: 0.2864, loss_mask: 0.2854, loss: 0.9011 2024-06-01 00:32:15,386 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 13:47:09, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0540, loss_cls: 0.2367, acc: 91.9307, loss_bbox: 0.2884, loss_mask: 0.2834, loss: 0.8973 2024-06-01 00:32:47,371 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 13:46:33, time: 0.640, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0518, loss_cls: 0.2376, acc: 91.8250, loss_bbox: 0.2905, loss_mask: 0.2780, loss: 0.8909 2024-06-01 00:33:19,674 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 13:46:00, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0552, loss_cls: 0.2427, acc: 91.7354, loss_bbox: 0.2945, loss_mask: 0.2916, loss: 0.9198 2024-06-01 00:33:51,619 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 13:45:24, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0547, loss_cls: 0.2505, acc: 91.2737, loss_bbox: 0.3069, loss_mask: 0.2891, loss: 0.9346 2024-06-01 00:34:23,322 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 13:44:47, time: 0.634, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0560, loss_cls: 0.2436, acc: 91.5715, loss_bbox: 0.3008, loss_mask: 0.2839, loss: 0.9184 2024-06-01 00:34:54,900 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 13:44:09, time: 0.632, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0543, loss_cls: 0.2392, acc: 91.8115, loss_bbox: 0.2927, loss_mask: 0.2915, loss: 0.9109 
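In each per-iteration record, the reported loss is, up to display rounding, the sum of the five logged component terms loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox and loss_mask (acc is an accuracy, not a loss). A minimal sanity check against the Epoch [2][4400/7330] entry just above:

```python
# Quick consistency check: the total "loss" reported per iteration equals the
# sum of its logged loss components, up to display rounding. Numbers copied
# from the Epoch [2][4400/7330] entry above.
components = {
    "loss_rpn_cls": 0.0331,
    "loss_rpn_bbox": 0.0543,
    "loss_cls": 0.2392,
    "loss_bbox": 0.2927,
    "loss_mask": 0.2915,
}
reported_total = 0.9109
assert abs(sum(components.values()) - reported_total) < 5e-4  # 0.9108 vs 0.9109
```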
2024-06-01 00:35:26,778 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 13:43:33, time: 0.638, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0518, loss_cls: 0.2352, acc: 91.8691, loss_bbox: 0.2905, loss_mask: 0.2908, loss: 0.9002 2024-06-01 00:36:01,034 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 13:43:12, time: 0.685, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0572, loss_cls: 0.2416, acc: 91.6746, loss_bbox: 0.2945, loss_mask: 0.2804, loss: 0.9089 2024-06-01 00:36:32,849 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 13:42:36, time: 0.636, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0529, loss_cls: 0.2424, acc: 91.7009, loss_bbox: 0.2910, loss_mask: 0.2828, loss: 0.9005 2024-06-01 00:37:04,652 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 13:41:59, time: 0.636, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0563, loss_cls: 0.2402, acc: 91.8674, loss_bbox: 0.2898, loss_mask: 0.2866, loss: 0.9074 2024-06-01 00:37:36,523 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 13:41:23, time: 0.637, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0555, loss_cls: 0.2387, acc: 91.7190, loss_bbox: 0.2925, loss_mask: 0.2917, loss: 0.9131 2024-06-01 00:38:08,450 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 13:40:48, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0538, loss_cls: 0.2325, acc: 91.9719, loss_bbox: 0.2832, loss_mask: 0.2771, loss: 0.8822 2024-06-01 00:38:40,418 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 13:40:12, time: 0.639, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0381, loss_rpn_bbox: 0.0558, loss_cls: 0.2520, acc: 91.2646, loss_bbox: 0.3028, loss_mask: 0.2894, loss: 0.9381 2024-06-01 00:39:12,264 - mmdet - INFO - Epoch [2][4800/7330] lr: 1.000e-04, eta: 13:39:36, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0528, loss_cls: 0.2297, acc: 92.1326, loss_bbox: 0.2823, loss_mask: 0.2795, loss: 0.8772 2024-06-01 00:39:44,408 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 13:39:02, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0527, loss_cls: 0.2327, acc: 91.8347, loss_bbox: 0.2905, loss_mask: 0.2751, loss: 0.8826 2024-06-01 00:40:16,093 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 13:38:25, time: 0.634, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0542, loss_cls: 0.2348, acc: 91.8430, loss_bbox: 0.2896, loss_mask: 0.2823, loss: 0.8911 2024-06-01 00:40:48,309 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 13:37:51, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0555, loss_cls: 0.2428, acc: 91.5957, loss_bbox: 0.3003, loss_mask: 0.2932, loss: 0.9270 2024-06-01 00:41:19,869 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 13:37:14, time: 0.631, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0517, loss_cls: 0.2316, acc: 92.0159, loss_bbox: 0.2814, loss_mask: 0.2745, loss: 0.8680 2024-06-01 00:41:51,998 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 13:36:40, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0551, loss_cls: 0.2461, acc: 91.5107, loss_bbox: 0.2993, loss_mask: 0.2875, loss: 0.9227 2024-06-01 00:42:35,529 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 13:37:15, 
time: 0.871, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0529, loss_cls: 0.2426, acc: 91.6697, loss_bbox: 0.2899, loss_mask: 0.2848, loss: 0.9049 2024-06-01 00:43:09,691 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 13:36:52, time: 0.683, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0558, loss_cls: 0.2527, acc: 91.3433, loss_bbox: 0.3019, loss_mask: 0.2889, loss: 0.9319 2024-06-01 00:43:41,483 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 13:36:16, time: 0.636, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0522, loss_cls: 0.2313, acc: 92.0757, loss_bbox: 0.2819, loss_mask: 0.2775, loss: 0.8765 2024-06-01 00:44:13,416 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 13:35:40, time: 0.639, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0548, loss_cls: 0.2371, acc: 91.7529, loss_bbox: 0.2910, loss_mask: 0.2850, loss: 0.9033 2024-06-01 00:44:45,136 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 13:35:03, time: 0.634, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0537, loss_cls: 0.2328, acc: 92.0840, loss_bbox: 0.2837, loss_mask: 0.2793, loss: 0.8830 2024-06-01 00:45:16,723 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 13:34:26, time: 0.631, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0528, loss_cls: 0.2336, acc: 92.0237, loss_bbox: 0.2850, loss_mask: 0.2823, loss: 0.8906 2024-06-01 00:45:48,495 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 13:33:49, time: 0.636, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0528, loss_cls: 0.2391, acc: 91.6731, loss_bbox: 0.2928, loss_mask: 0.2811, loss: 0.8986 2024-06-01 00:46:20,160 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 13:33:12, time: 0.633, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0531, loss_cls: 0.2353, acc: 91.9216, loss_bbox: 0.2847, loss_mask: 0.2795, loss: 0.8858 2024-06-01 00:46:52,486 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 13:32:39, time: 0.647, data_time: 0.044, memory: 13277, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0574, loss_cls: 0.2442, acc: 91.5222, loss_bbox: 0.3027, loss_mask: 0.2887, loss: 0.9284 2024-06-01 00:47:27,150 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 13:32:19, time: 0.693, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0545, loss_cls: 0.2347, acc: 91.8630, loss_bbox: 0.2886, loss_mask: 0.2831, loss: 0.8962 2024-06-01 00:47:58,803 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 13:31:42, time: 0.633, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0340, loss_rpn_bbox: 0.0532, loss_cls: 0.2375, acc: 91.7107, loss_bbox: 0.2953, loss_mask: 0.2788, loss: 0.8988 2024-06-01 00:48:30,860 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 13:31:08, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0577, loss_cls: 0.2473, acc: 91.4324, loss_bbox: 0.3079, loss_mask: 0.2915, loss: 0.9398 2024-06-01 00:49:02,533 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 13:30:31, time: 0.633, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0544, loss_cls: 0.2407, acc: 91.8252, loss_bbox: 0.2968, loss_mask: 0.2905, loss: 0.9174 2024-06-01 00:49:34,098 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 13:29:53, time: 0.631, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0537, 
loss_cls: 0.2328, acc: 91.9724, loss_bbox: 0.2865, loss_mask: 0.2823, loss: 0.8911 2024-06-01 00:50:06,244 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 13:29:19, time: 0.643, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0538, loss_cls: 0.2399, acc: 91.6770, loss_bbox: 0.2927, loss_mask: 0.2810, loss: 0.8982 2024-06-01 00:50:37,987 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 13:28:42, time: 0.635, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0522, loss_cls: 0.2396, acc: 91.7593, loss_bbox: 0.2913, loss_mask: 0.2831, loss: 0.8979 2024-06-01 00:51:09,720 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 13:28:06, time: 0.635, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0532, loss_cls: 0.2370, acc: 91.8933, loss_bbox: 0.2835, loss_mask: 0.2786, loss: 0.8864 2024-06-01 00:51:41,831 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 13:27:32, time: 0.642, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0545, loss_cls: 0.2376, acc: 91.8440, loss_bbox: 0.2889, loss_mask: 0.2840, loss: 0.8991 2024-06-01 00:52:13,884 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 13:26:57, time: 0.641, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0551, loss_cls: 0.2386, acc: 91.8276, loss_bbox: 0.2864, loss_mask: 0.2816, loss: 0.8954 2024-06-01 00:52:45,775 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 13:26:22, time: 0.638, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0541, loss_cls: 0.2412, acc: 91.7761, loss_bbox: 0.2914, loss_mask: 0.2868, loss: 0.9084 2024-06-01 00:53:17,129 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 13:25:43, time: 0.627, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0528, loss_cls: 0.2405, acc: 91.8394, loss_bbox: 0.2884, loss_mask: 0.2831, loss: 0.8964 2024-06-01 00:53:54,377 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 13:25:37, time: 0.745, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0580, loss_cls: 0.2439, acc: 91.5508, loss_bbox: 0.2941, loss_mask: 0.2820, loss: 0.9134 2024-06-01 00:54:36,361 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 13:25:57, time: 0.840, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0559, loss_cls: 0.2381, acc: 91.7708, loss_bbox: 0.2896, loss_mask: 0.2838, loss: 0.9029 2024-06-01 00:55:08,483 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 13:25:23, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0533, loss_cls: 0.2456, acc: 91.6045, loss_bbox: 0.2927, loss_mask: 0.2814, loss: 0.9060 2024-06-01 00:55:40,229 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 13:24:46, time: 0.635, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0510, loss_cls: 0.2320, acc: 92.1501, loss_bbox: 0.2787, loss_mask: 0.2798, loss: 0.8765 2024-06-01 00:56:12,034 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 13:24:10, time: 0.636, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0524, loss_cls: 0.2343, acc: 91.8601, loss_bbox: 0.2900, loss_mask: 0.2856, loss: 0.8966 2024-06-01 00:56:43,829 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 13:23:34, time: 0.636, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0536, loss_cls: 0.2353, acc: 91.8103, loss_bbox: 0.2852, loss_mask: 0.2720, loss: 0.8784 2024-06-01 
00:57:15,758 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 13:22:59, time: 0.639, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0536, loss_cls: 0.2350, acc: 91.8994, loss_bbox: 0.2875, loss_mask: 0.2761, loss: 0.8864 2024-06-01 00:57:47,386 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 13:22:22, time: 0.633, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0545, loss_cls: 0.2379, acc: 91.7930, loss_bbox: 0.2891, loss_mask: 0.2863, loss: 0.9016 2024-06-01 00:58:19,073 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 13:21:45, time: 0.634, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0534, loss_cls: 0.2341, acc: 91.8787, loss_bbox: 0.2854, loss_mask: 0.2762, loss: 0.8815 2024-06-01 00:58:50,762 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 13:21:08, time: 0.634, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0514, loss_cls: 0.2376, acc: 91.8103, loss_bbox: 0.2907, loss_mask: 0.2838, loss: 0.8947 2024-06-01 00:59:25,486 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 13:20:48, time: 0.694, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0532, loss_cls: 0.2220, acc: 92.2881, loss_bbox: 0.2746, loss_mask: 0.2758, loss: 0.8574 2024-06-01 00:59:57,877 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 13:20:15, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0524, loss_cls: 0.2285, acc: 92.0601, loss_bbox: 0.2812, loss_mask: 0.2852, loss: 0.8797 2024-06-01 01:00:29,387 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 13:19:37, time: 0.630, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0559, loss_cls: 0.2297, acc: 92.1533, loss_bbox: 0.2762, loss_mask: 0.2799, loss: 0.8740 2024-06-01 01:01:01,459 - mmdet - INFO - Epoch [2][6800/7330] lr: 1.000e-04, eta: 13:19:03, time: 0.641, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0529, loss_cls: 0.2256, acc: 92.1997, loss_bbox: 0.2793, loss_mask: 0.2819, loss: 0.8690 2024-06-01 01:01:33,404 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 13:18:28, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0540, loss_cls: 0.2376, acc: 91.9080, loss_bbox: 0.2872, loss_mask: 0.2879, loss: 0.8988 2024-06-01 01:02:05,221 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 13:17:52, time: 0.636, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0561, loss_cls: 0.2297, acc: 92.1680, loss_bbox: 0.2753, loss_mask: 0.2759, loss: 0.8717 2024-06-01 01:02:36,881 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 13:17:15, time: 0.634, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0514, loss_cls: 0.2290, acc: 92.1143, loss_bbox: 0.2746, loss_mask: 0.2691, loss: 0.8569 2024-06-01 01:03:09,253 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 13:16:42, time: 0.647, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0527, loss_cls: 0.2434, acc: 91.7290, loss_bbox: 0.2902, loss_mask: 0.2818, loss: 0.9006 2024-06-01 01:03:41,242 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 13:16:07, time: 0.639, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0538, loss_cls: 0.2334, acc: 91.9819, loss_bbox: 0.2872, loss_mask: 0.2770, loss: 0.8853 2024-06-01 01:04:13,064 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 13:15:32, time: 0.637, 
data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0531, loss_cls: 0.2265, acc: 92.0710, loss_bbox: 0.2807, loss_mask: 0.2782, loss: 0.8735
2024-06-01 01:04:44,923 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 13:14:56, time: 0.637, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0537, loss_cls: 0.2423, acc: 91.7498, loss_bbox: 0.2921, loss_mask: 0.2770, loss: 0.8953
2024-06-01 01:05:16,731 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 13:14:20, time: 0.636, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0494, loss_cls: 0.2256, acc: 92.2397, loss_bbox: 0.2769, loss_mask: 0.2773, loss: 0.8590
2024-06-01 01:05:56,435 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 13:14:24, time: 0.794, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0543, loss_cls: 0.2319, acc: 91.9963, loss_bbox: 0.2790, loss_mask: 0.2705, loss: 0.8689
2024-06-01 01:06:35,613 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 13:14:25, time: 0.784, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0510, loss_cls: 0.2297, acc: 92.1155, loss_bbox: 0.2833, loss_mask: 0.2797, loss: 0.8751
2024-06-01 01:06:55,356 - mmdet - INFO - Saving checkpoint at 2 epochs
2024-06-01 01:08:35,778 - mmdet - INFO - Evaluating bbox...
2024-06-01 01:09:02,471 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.374
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.626
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.398
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.205
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.418
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.527
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.495
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.495
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.495
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.287
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.546
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.662
2024-06-01 01:09:02,471 - mmdet - INFO - Evaluating segm...
2024-06-01 01:09:30,374 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.346
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.584
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.355
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.129
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.378
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.565
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.455
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.455
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.455
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.219
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.507
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.666
2024-06-01 01:09:30,841 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 01:09:30,843 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3740, bbox_mAP_50: 0.6260, bbox_mAP_75: 0.3980, bbox_mAP_s: 0.2050, bbox_mAP_m: 0.4180, bbox_mAP_l: 0.5270, bbox_mAP_copypaste: 0.374 0.626 0.398 0.205 0.418 0.527, segm_mAP: 0.3460, segm_mAP_50: 0.5840, segm_mAP_75: 0.3550, segm_mAP_s: 0.1290, segm_mAP_m: 0.3780, segm_mAP_l: 0.5650, segm_mAP_copypaste: 0.346 0.584 0.355 0.129 0.378 0.565
2024-06-01 01:10:16,050 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 13:13:00, time: 0.904, data_time: 0.107, memory: 13277, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0541, loss_cls: 0.2278, acc: 92.0232, loss_bbox: 0.2831, loss_mask: 0.2765, loss: 0.8737
2024-06-01 01:10:47,805 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 13:12:23, time: 0.635, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0499, loss_cls: 0.2253, acc: 92.1760, loss_bbox: 0.2791, loss_mask: 0.2726, loss: 0.8556
2024-06-01 01:11:19,710 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 13:11:48, time: 0.638, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0533, loss_cls: 0.2305, acc: 91.9348, loss_bbox: 0.2867, loss_mask: 0.2769, loss: 0.8781
2024-06-01 01:11:51,817 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 13:11:14, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0533, loss_cls: 0.2264, acc: 92.0667, loss_bbox: 0.2779, loss_mask: 0.2743, loss: 0.8626
2024-06-01 01:12:23,802 - mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 13:10:39, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0511, loss_cls: 0.2266, acc: 92.0371, loss_bbox: 0.2850, loss_mask: 0.2740, loss: 0.8645
2024-06-01 01:12:55,826 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 13:10:04, time: 0.640, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0499, loss_cls: 0.2188, acc: 92.1472, loss_bbox: 0.2776, loss_mask: 0.2755, loss: 0.8520
2024-06-01 01:13:27,869 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 13:09:30, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0518, loss_cls: 0.2299, acc: 91.9736, loss_bbox: 0.2810, loss_mask: 0.2741, loss: 0.8670
2024-06-01 01:14:00,069 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 13:08:56, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0533, loss_cls: 0.2289, acc: 92.0222, loss_bbox: 0.2805,
loss_mask: 0.2719, loss: 0.8635 2024-06-01 01:14:31,979 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 13:08:21, time: 0.638, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0518, loss_cls: 0.2206, acc: 92.2351, loss_bbox: 0.2770, loss_mask: 0.2721, loss: 0.8514 2024-06-01 01:15:04,381 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 13:07:48, time: 0.648, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0515, loss_cls: 0.2297, acc: 91.9575, loss_bbox: 0.2873, loss_mask: 0.2747, loss: 0.8732 2024-06-01 01:15:36,465 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 13:07:14, time: 0.642, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0503, loss_cls: 0.2228, acc: 92.0754, loss_bbox: 0.2780, loss_mask: 0.2765, loss: 0.8573 2024-06-01 01:16:08,239 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 13:06:38, time: 0.635, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0475, loss_cls: 0.2163, acc: 92.4241, loss_bbox: 0.2705, loss_mask: 0.2798, loss: 0.8405 2024-06-01 01:16:40,334 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 13:06:04, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0523, loss_cls: 0.2231, acc: 92.1265, loss_bbox: 0.2807, loss_mask: 0.2691, loss: 0.8561 2024-06-01 01:17:12,040 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 13:05:28, time: 0.634, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0511, loss_cls: 0.2177, acc: 92.3186, loss_bbox: 0.2700, loss_mask: 0.2679, loss: 0.8388 2024-06-01 01:17:43,922 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 13:04:53, time: 0.638, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0500, loss_cls: 0.2140, acc: 92.3997, loss_bbox: 0.2665, loss_mask: 0.2696, loss: 0.8309 2024-06-01 01:18:15,839 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 13:04:18, time: 0.638, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0500, loss_cls: 0.2168, acc: 92.3264, loss_bbox: 0.2722, loss_mask: 0.2669, loss: 0.8334 2024-06-01 01:18:47,817 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 13:03:43, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0528, loss_cls: 0.2289, acc: 92.0645, loss_bbox: 0.2836, loss_mask: 0.2709, loss: 0.8651 2024-06-01 01:19:19,849 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 13:03:09, time: 0.641, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0516, loss_cls: 0.2216, acc: 92.1270, loss_bbox: 0.2746, loss_mask: 0.2672, loss: 0.8458 2024-06-01 01:19:51,903 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 13:02:34, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0531, loss_cls: 0.2303, acc: 91.7703, loss_bbox: 0.2919, loss_mask: 0.2748, loss: 0.8790 2024-06-01 01:20:23,688 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 13:01:59, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0518, loss_cls: 0.2226, acc: 92.1025, loss_bbox: 0.2787, loss_mask: 0.2729, loss: 0.8560 2024-06-01 01:20:55,485 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 13:01:23, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0523, loss_cls: 0.2300, acc: 91.9319, loss_bbox: 0.2824, loss_mask: 0.2768, loss: 0.8719 2024-06-01 01:21:31,698 - mmdet - INFO - Epoch [3][1100/7330] lr: 
1.000e-04, eta: 13:01:08, time: 0.724, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0464, loss_cls: 0.2236, acc: 92.0823, loss_bbox: 0.2778, loss_mask: 0.2731, loss: 0.8488 2024-06-01 01:22:03,333 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 13:00:32, time: 0.633, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0474, loss_cls: 0.2162, acc: 92.5239, loss_bbox: 0.2659, loss_mask: 0.2663, loss: 0.8238 2024-06-01 01:22:41,319 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 13:00:25, time: 0.760, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0503, loss_cls: 0.2276, acc: 92.0754, loss_bbox: 0.2795, loss_mask: 0.2733, loss: 0.8619 2024-06-01 01:23:21,023 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 13:00:25, time: 0.794, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0464, loss_cls: 0.2217, acc: 92.2532, loss_bbox: 0.2807, loss_mask: 0.2704, loss: 0.8456 2024-06-01 01:23:55,858 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 13:00:03, time: 0.697, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0508, loss_cls: 0.2190, acc: 92.2002, loss_bbox: 0.2785, loss_mask: 0.2672, loss: 0.8453 2024-06-01 01:24:27,756 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 12:59:28, time: 0.638, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0532, loss_cls: 0.2268, acc: 92.0940, loss_bbox: 0.2772, loss_mask: 0.2732, loss: 0.8608 2024-06-01 01:25:00,020 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 12:58:54, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0537, loss_cls: 0.2342, acc: 91.7410, loss_bbox: 0.2942, loss_mask: 0.2744, loss: 0.8866 2024-06-01 01:25:32,299 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 12:58:21, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0461, loss_cls: 0.2144, acc: 92.3696, loss_bbox: 0.2703, loss_mask: 0.2641, loss: 0.8234 2024-06-01 01:26:04,178 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 12:57:45, time: 0.638, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0501, loss_cls: 0.2261, acc: 92.1902, loss_bbox: 0.2775, loss_mask: 0.2727, loss: 0.8568 2024-06-01 01:26:36,224 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 12:57:11, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0496, loss_cls: 0.2253, acc: 92.1001, loss_bbox: 0.2799, loss_mask: 0.2694, loss: 0.8520 2024-06-01 01:27:08,582 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 12:56:38, time: 0.647, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0516, loss_cls: 0.2286, acc: 92.1196, loss_bbox: 0.2825, loss_mask: 0.2737, loss: 0.8659 2024-06-01 01:27:40,696 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 12:56:04, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0494, loss_cls: 0.2243, acc: 92.1460, loss_bbox: 0.2798, loss_mask: 0.2724, loss: 0.8547 2024-06-01 01:28:12,640 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 12:55:29, time: 0.639, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0505, loss_cls: 0.2180, acc: 92.3154, loss_bbox: 0.2700, loss_mask: 0.2670, loss: 0.8333 2024-06-01 01:28:44,612 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 12:54:54, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0305, 
loss_rpn_bbox: 0.0506, loss_cls: 0.2252, acc: 92.0134, loss_bbox: 0.2789, loss_mask: 0.2729, loss: 0.8580 2024-06-01 01:29:16,504 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 12:54:19, time: 0.638, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0498, loss_cls: 0.2239, acc: 92.0371, loss_bbox: 0.2756, loss_mask: 0.2695, loss: 0.8492 2024-06-01 01:29:48,550 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 12:53:45, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0535, loss_cls: 0.2297, acc: 91.9080, loss_bbox: 0.2869, loss_mask: 0.2701, loss: 0.8717 2024-06-01 01:30:20,669 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 12:53:10, time: 0.642, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0485, loss_cls: 0.2139, acc: 92.3582, loss_bbox: 0.2713, loss_mask: 0.2675, loss: 0.8297 2024-06-01 01:30:52,729 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 12:52:36, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0507, loss_cls: 0.2261, acc: 92.0178, loss_bbox: 0.2813, loss_mask: 0.2777, loss: 0.8648 2024-06-01 01:31:24,807 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 12:52:02, time: 0.641, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0489, loss_cls: 0.2221, acc: 92.2319, loss_bbox: 0.2752, loss_mask: 0.2636, loss: 0.8389 2024-06-01 01:31:56,892 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 12:51:28, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0492, loss_cls: 0.2247, acc: 92.1565, loss_bbox: 0.2725, loss_mask: 0.2646, loss: 0.8381 2024-06-01 01:32:29,029 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 12:50:54, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0509, loss_cls: 0.2278, acc: 91.8882, loss_bbox: 0.2895, loss_mask: 0.2758, loss: 0.8719 2024-06-01 01:33:01,169 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 12:50:20, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0506, loss_cls: 0.2241, acc: 92.1426, loss_bbox: 0.2780, loss_mask: 0.2698, loss: 0.8521 2024-06-01 01:33:38,249 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 12:50:07, time: 0.742, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0508, loss_cls: 0.2256, acc: 92.1665, loss_bbox: 0.2781, loss_mask: 0.2705, loss: 0.8535 2024-06-01 01:34:12,765 - mmdet - INFO - Epoch [3][2250/7330] lr: 1.000e-04, eta: 12:49:43, time: 0.690, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0523, loss_cls: 0.2313, acc: 91.7903, loss_bbox: 0.2902, loss_mask: 0.2745, loss: 0.8782 2024-06-01 01:34:52,220 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 12:49:39, time: 0.789, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0490, loss_cls: 0.2265, acc: 92.0354, loss_bbox: 0.2792, loss_mask: 0.2702, loss: 0.8516 2024-06-01 01:35:26,701 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 12:49:15, time: 0.690, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0497, loss_cls: 0.2219, acc: 92.1899, loss_bbox: 0.2762, loss_mask: 0.2711, loss: 0.8500 2024-06-01 01:35:58,861 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 12:48:41, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0487, loss_cls: 0.2245, acc: 92.0317, loss_bbox: 0.2803, loss_mask: 0.2722, 
loss: 0.8545 2024-06-01 01:36:30,879 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 12:48:06, time: 0.640, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0503, loss_cls: 0.2122, acc: 92.4893, loss_bbox: 0.2636, loss_mask: 0.2704, loss: 0.8223 2024-06-01 01:37:03,195 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 12:47:33, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0472, loss_cls: 0.2141, acc: 92.3965, loss_bbox: 0.2681, loss_mask: 0.2696, loss: 0.8255 2024-06-01 01:37:35,525 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 12:47:00, time: 0.646, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0534, loss_cls: 0.2283, acc: 91.9370, loss_bbox: 0.2837, loss_mask: 0.2754, loss: 0.8708 2024-06-01 01:38:07,790 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 12:46:26, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0507, loss_cls: 0.2306, acc: 91.7856, loss_bbox: 0.2796, loss_mask: 0.2710, loss: 0.8623 2024-06-01 01:38:40,031 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 12:45:52, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0488, loss_cls: 0.2198, acc: 92.3286, loss_bbox: 0.2733, loss_mask: 0.2660, loss: 0.8371 2024-06-01 01:39:12,526 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 12:45:20, time: 0.650, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0511, loss_cls: 0.2284, acc: 92.0251, loss_bbox: 0.2737, loss_mask: 0.2717, loss: 0.8553 2024-06-01 01:39:44,614 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 12:44:45, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0485, loss_cls: 0.2187, acc: 92.2124, loss_bbox: 0.2765, loss_mask: 0.2688, loss: 0.8390 2024-06-01 01:40:16,923 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 12:44:12, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0515, loss_cls: 0.2237, acc: 92.1521, loss_bbox: 0.2801, loss_mask: 0.2669, loss: 0.8518 2024-06-01 01:40:49,411 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 12:43:39, time: 0.650, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0532, loss_cls: 0.2253, acc: 91.9604, loss_bbox: 0.2838, loss_mask: 0.2737, loss: 0.8659 2024-06-01 01:41:21,325 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 12:43:04, time: 0.638, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0501, loss_cls: 0.2195, acc: 92.3347, loss_bbox: 0.2681, loss_mask: 0.2736, loss: 0.8389 2024-06-01 01:41:53,467 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 12:42:30, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0522, loss_cls: 0.2239, acc: 92.1179, loss_bbox: 0.2799, loss_mask: 0.2699, loss: 0.8552 2024-06-01 01:42:25,356 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 12:41:55, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0486, loss_cls: 0.2113, acc: 92.5288, loss_bbox: 0.2639, loss_mask: 0.2654, loss: 0.8161 2024-06-01 01:42:57,332 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 12:41:21, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0490, loss_cls: 0.2137, acc: 92.5608, loss_bbox: 0.2652, loss_mask: 0.2704, loss: 0.8273 2024-06-01 01:43:29,274 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 
12:40:46, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0489, loss_cls: 0.2182, acc: 92.3784, loss_bbox: 0.2713, loss_mask: 0.2711, loss: 0.8387 2024-06-01 01:44:01,583 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 12:40:13, time: 0.646, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0509, loss_cls: 0.2288, acc: 92.0386, loss_bbox: 0.2810, loss_mask: 0.2786, loss: 0.8691 2024-06-01 01:44:33,681 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 12:39:39, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0496, loss_cls: 0.2190, acc: 92.2000, loss_bbox: 0.2768, loss_mask: 0.2727, loss: 0.8455 2024-06-01 01:45:10,286 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 12:39:22, time: 0.732, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0478, loss_cls: 0.2148, acc: 92.4175, loss_bbox: 0.2669, loss_mask: 0.2610, loss: 0.8211 2024-06-01 01:45:45,022 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 12:38:58, time: 0.695, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0502, loss_cls: 0.2167, acc: 92.2712, loss_bbox: 0.2697, loss_mask: 0.2699, loss: 0.8344 2024-06-01 01:46:19,982 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 12:38:35, time: 0.699, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0514, loss_cls: 0.2322, acc: 91.7791, loss_bbox: 0.2824, loss_mask: 0.2674, loss: 0.8636 2024-06-01 01:46:59,249 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 12:38:29, time: 0.785, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0496, loss_cls: 0.2226, acc: 92.2920, loss_bbox: 0.2695, loss_mask: 0.2675, loss: 0.8372 2024-06-01 01:47:31,307 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 12:37:54, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0492, loss_cls: 0.2215, acc: 92.2002, loss_bbox: 0.2745, loss_mask: 0.2643, loss: 0.8380 2024-06-01 01:48:03,653 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 12:37:21, time: 0.647, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0513, loss_cls: 0.2216, acc: 92.1179, loss_bbox: 0.2825, loss_mask: 0.2745, loss: 0.8593 2024-06-01 01:48:35,943 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 12:36:47, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0489, loss_cls: 0.2125, acc: 92.5381, loss_bbox: 0.2673, loss_mask: 0.2677, loss: 0.8241 2024-06-01 01:49:08,458 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 12:36:15, time: 0.650, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0509, loss_cls: 0.2309, acc: 91.9963, loss_bbox: 0.2790, loss_mask: 0.2676, loss: 0.8585 2024-06-01 01:49:40,722 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 12:35:41, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0502, loss_cls: 0.2203, acc: 92.1636, loss_bbox: 0.2706, loss_mask: 0.2736, loss: 0.8422 2024-06-01 01:50:12,613 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 12:35:06, time: 0.638, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0512, loss_cls: 0.2263, acc: 92.1404, loss_bbox: 0.2763, loss_mask: 0.2646, loss: 0.8478 2024-06-01 01:50:44,602 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 12:34:31, time: 0.640, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0271, loss_rpn_bbox: 
0.0476, loss_cls: 0.2164, acc: 92.2678, loss_bbox: 0.2728, loss_mask: 0.2691, loss: 0.8329 2024-06-01 01:51:16,655 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 12:33:57, time: 0.641, data_time: 0.046, memory: 13277, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0491, loss_cls: 0.2172, acc: 92.4224, loss_bbox: 0.2669, loss_mask: 0.2630, loss: 0.8259 2024-06-01 01:51:48,627 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 12:33:22, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0448, loss_cls: 0.2087, acc: 92.6343, loss_bbox: 0.2585, loss_mask: 0.2583, loss: 0.7963 2024-06-01 01:52:20,771 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 12:32:48, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0509, loss_cls: 0.2201, acc: 92.2463, loss_bbox: 0.2721, loss_mask: 0.2698, loss: 0.8415 2024-06-01 01:52:52,806 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 12:32:14, time: 0.641, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0527, loss_cls: 0.2102, acc: 92.5364, loss_bbox: 0.2697, loss_mask: 0.2682, loss: 0.8298 2024-06-01 01:53:24,911 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 12:31:39, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0491, loss_cls: 0.2120, acc: 92.5833, loss_bbox: 0.2656, loss_mask: 0.2620, loss: 0.8174 2024-06-01 01:53:56,898 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 12:31:05, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0488, loss_cls: 0.2219, acc: 92.1443, loss_bbox: 0.2752, loss_mask: 0.2745, loss: 0.8481 2024-06-01 01:54:29,053 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 12:30:31, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0463, loss_cls: 0.2036, acc: 92.7959, loss_bbox: 0.2527, loss_mask: 0.2555, loss: 0.7851 2024-06-01 01:55:01,668 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 12:29:59, time: 0.652, data_time: 0.050, memory: 13277, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0512, loss_cls: 0.2209, acc: 92.2175, loss_bbox: 0.2699, loss_mask: 0.2653, loss: 0.8386 2024-06-01 01:55:33,979 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 12:29:25, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0476, loss_cls: 0.2078, acc: 92.6606, loss_bbox: 0.2620, loss_mask: 0.2627, loss: 0.8081 2024-06-01 01:56:06,489 - mmdet - INFO - Epoch [3][4250/7330] lr: 1.000e-04, eta: 12:28:53, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0512, loss_cls: 0.2194, acc: 92.2295, loss_bbox: 0.2751, loss_mask: 0.2689, loss: 0.8462 2024-06-01 01:56:43,553 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 12:28:37, time: 0.741, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0512, loss_cls: 0.2199, acc: 92.2290, loss_bbox: 0.2709, loss_mask: 0.2701, loss: 0.8388 2024-06-01 01:57:16,254 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 12:28:05, time: 0.654, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0503, loss_cls: 0.2230, acc: 92.0234, loss_bbox: 0.2786, loss_mask: 0.2631, loss: 0.8422 2024-06-01 01:57:53,426 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 12:27:49, time: 0.743, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0516, loss_cls: 0.2193, acc: 92.3164, loss_bbox: 0.2700, loss_mask: 0.2636, loss: 0.8328 
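Every per-iteration entry in this log follows the same key/value layout (Epoch [e][iter/total] lr: ..., loss_rpn_cls: ..., ..., loss: ...), so per-iteration loss curves can be recovered from the text alone. Below is a minimal parsing sketch under that assumption; the regular expression simply mirrors the entry format shown above, and the file name "train.log" is a hypothetical placeholder, not a file produced by this run.

import re

# Minimal sketch: extract (epoch, iteration, total loss) from mmdetection-style
# text logs whose entries look like the ones printed above.
# "train.log" is a hypothetical placeholder path.
ENTRY = re.compile(
    r"Epoch \[(?P<epoch>\d+)\]\[(?P<iter>\d+)/\d+\].*?loss: (?P<loss>\d+\.\d+)"
)

def parse_losses(path="train.log"):
    """Return a list of (epoch, iteration, total_loss) tuples in log order."""
    points = []
    with open(path) as f:
        for line in f:
            # finditer also copes with several entries glued onto one line
            for m in ENTRY.finditer(line):
                points.append((int(m["epoch"]), int(m["iter"]), float(m["loss"])))
    return points

if __name__ == "__main__":
    for epoch, it, loss in parse_losses()[:5]:
        print(f"epoch {epoch} iter {it}: loss={loss:.4f}")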
2024-06-01 01:58:31,787 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 12:27:37, time: 0.767, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0469, loss_cls: 0.2053, acc: 92.7749, loss_bbox: 0.2567, loss_mask: 0.2554, loss: 0.7892 2024-06-01 01:59:08,659 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 12:27:20, time: 0.737, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0519, loss_cls: 0.2346, acc: 91.7209, loss_bbox: 0.2883, loss_mask: 0.2698, loss: 0.8723 2024-06-01 01:59:41,036 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 12:26:47, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0514, loss_cls: 0.2207, acc: 92.2766, loss_bbox: 0.2707, loss_mask: 0.2690, loss: 0.8408 2024-06-01 02:00:13,245 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 12:26:13, time: 0.644, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0479, loss_cls: 0.2157, acc: 92.4343, loss_bbox: 0.2670, loss_mask: 0.2660, loss: 0.8229 2024-06-01 02:00:45,349 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 12:25:38, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0521, loss_cls: 0.2170, acc: 92.2588, loss_bbox: 0.2736, loss_mask: 0.2741, loss: 0.8466 2024-06-01 02:01:17,387 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 12:25:04, time: 0.641, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0523, loss_cls: 0.2234, acc: 92.1956, loss_bbox: 0.2735, loss_mask: 0.2664, loss: 0.8455 2024-06-01 02:01:49,691 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 12:24:30, time: 0.646, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0510, loss_cls: 0.2134, acc: 92.4172, loss_bbox: 0.2707, loss_mask: 0.2731, loss: 0.8360 2024-06-01 02:02:22,181 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 12:23:57, time: 0.650, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0495, loss_cls: 0.2168, acc: 92.3508, loss_bbox: 0.2714, loss_mask: 0.2672, loss: 0.8351 2024-06-01 02:02:54,130 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 12:23:23, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0485, loss_cls: 0.2210, acc: 92.3057, loss_bbox: 0.2680, loss_mask: 0.2652, loss: 0.8325 2024-06-01 02:03:26,280 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 12:22:48, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0484, loss_cls: 0.2168, acc: 92.2983, loss_bbox: 0.2710, loss_mask: 0.2637, loss: 0.8279 2024-06-01 02:03:58,533 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 12:22:15, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0501, loss_cls: 0.2182, acc: 92.2820, loss_bbox: 0.2750, loss_mask: 0.2642, loss: 0.8358 2024-06-01 02:04:30,465 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 12:21:40, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0482, loss_cls: 0.2107, acc: 92.6616, loss_bbox: 0.2619, loss_mask: 0.2662, loss: 0.8139 2024-06-01 02:05:02,350 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 12:21:05, time: 0.638, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0475, loss_cls: 0.2140, acc: 92.6113, loss_bbox: 0.2607, loss_mask: 0.2620, loss: 0.8145 2024-06-01 02:05:34,304 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 12:20:30, 
time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0519, loss_cls: 0.2259, acc: 92.0242, loss_bbox: 0.2805, loss_mask: 0.2695, loss: 0.8575 2024-06-01 02:06:06,463 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 12:19:56, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0514, loss_cls: 0.2217, acc: 92.2537, loss_bbox: 0.2725, loss_mask: 0.2646, loss: 0.8394 2024-06-01 02:06:38,706 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 12:19:23, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0475, loss_cls: 0.2244, acc: 92.1450, loss_bbox: 0.2734, loss_mask: 0.2721, loss: 0.8435 2024-06-01 02:07:10,766 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 12:18:48, time: 0.641, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0470, loss_cls: 0.2124, acc: 92.5166, loss_bbox: 0.2635, loss_mask: 0.2630, loss: 0.8143 2024-06-01 02:07:42,984 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 12:18:14, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0476, loss_cls: 0.2151, acc: 92.2705, loss_bbox: 0.2712, loss_mask: 0.2644, loss: 0.8263 2024-06-01 02:08:15,252 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 12:17:41, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0504, loss_cls: 0.2205, acc: 92.2620, loss_bbox: 0.2764, loss_mask: 0.2705, loss: 0.8468 2024-06-01 02:08:52,596 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 12:17:24, time: 0.747, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0483, loss_cls: 0.2099, acc: 92.5212, loss_bbox: 0.2626, loss_mask: 0.2630, loss: 0.8129 2024-06-01 02:09:26,927 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 12:16:58, time: 0.687, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0497, loss_cls: 0.2209, acc: 92.2129, loss_bbox: 0.2674, loss_mask: 0.2643, loss: 0.8352 2024-06-01 02:10:06,376 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 12:16:48, time: 0.788, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0506, loss_cls: 0.2086, acc: 92.5684, loss_bbox: 0.2639, loss_mask: 0.2651, loss: 0.8154 2024-06-01 02:10:41,885 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 12:16:25, time: 0.711, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0510, loss_cls: 0.2213, acc: 92.1599, loss_bbox: 0.2732, loss_mask: 0.2662, loss: 0.8413 2024-06-01 02:11:14,230 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 12:15:52, time: 0.647, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0503, loss_cls: 0.2163, acc: 92.3022, loss_bbox: 0.2694, loss_mask: 0.2606, loss: 0.8274 2024-06-01 02:11:46,454 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 12:15:18, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0484, loss_cls: 0.2105, acc: 92.5337, loss_bbox: 0.2644, loss_mask: 0.2550, loss: 0.8052 2024-06-01 02:12:18,851 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 12:14:45, time: 0.648, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0521, loss_cls: 0.2292, acc: 92.0012, loss_bbox: 0.2768, loss_mask: 0.2665, loss: 0.8537 2024-06-01 02:12:51,332 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 12:14:12, time: 0.650, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0517, 
loss_cls: 0.2202, acc: 92.2356, loss_bbox: 0.2699, loss_mask: 0.2670, loss: 0.8393 2024-06-01 02:13:23,666 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 12:13:38, time: 0.646, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0493, loss_cls: 0.2216, acc: 92.1196, loss_bbox: 0.2739, loss_mask: 0.2627, loss: 0.8374 2024-06-01 02:13:56,175 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 12:13:05, time: 0.651, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0513, loss_cls: 0.2273, acc: 91.9148, loss_bbox: 0.2831, loss_mask: 0.2736, loss: 0.8674 2024-06-01 02:14:28,612 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 12:12:32, time: 0.649, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0485, loss_cls: 0.2183, acc: 92.2998, loss_bbox: 0.2612, loss_mask: 0.2580, loss: 0.8142 2024-06-01 02:15:00,683 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 12:11:58, time: 0.641, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0502, loss_cls: 0.2220, acc: 92.1809, loss_bbox: 0.2735, loss_mask: 0.2659, loss: 0.8392 2024-06-01 02:15:32,724 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 12:11:23, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0491, loss_cls: 0.2223, acc: 92.1077, loss_bbox: 0.2755, loss_mask: 0.2707, loss: 0.8465 2024-06-01 02:16:04,888 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 12:10:49, time: 0.643, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0464, loss_cls: 0.2121, acc: 92.6985, loss_bbox: 0.2606, loss_mask: 0.2550, loss: 0.8007 2024-06-01 02:16:37,194 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 12:10:16, time: 0.646, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0509, loss_cls: 0.2319, acc: 91.9602, loss_bbox: 0.2780, loss_mask: 0.2675, loss: 0.8562 2024-06-01 02:17:09,680 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 12:09:43, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0521, loss_cls: 0.2235, acc: 92.0461, loss_bbox: 0.2749, loss_mask: 0.2674, loss: 0.8462 2024-06-01 02:17:41,814 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 12:09:09, time: 0.643, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0491, loss_cls: 0.2062, acc: 92.6016, loss_bbox: 0.2635, loss_mask: 0.2607, loss: 0.8053 2024-06-01 02:18:14,137 - mmdet - INFO - Epoch [3][6250/7330] lr: 1.000e-04, eta: 12:08:35, time: 0.646, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0504, loss_cls: 0.2283, acc: 91.9502, loss_bbox: 0.2762, loss_mask: 0.2596, loss: 0.8439 2024-06-01 02:18:46,449 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 12:08:02, time: 0.646, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0472, loss_cls: 0.2184, acc: 92.3721, loss_bbox: 0.2665, loss_mask: 0.2631, loss: 0.8219 2024-06-01 02:19:18,381 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 12:07:27, time: 0.638, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0509, loss_cls: 0.2298, acc: 91.9324, loss_bbox: 0.2799, loss_mask: 0.2700, loss: 0.8607 2024-06-01 02:19:50,571 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 12:06:53, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0471, loss_cls: 0.2182, acc: 92.2937, loss_bbox: 0.2665, loss_mask: 0.2590, loss: 0.8192 2024-06-01 
02:20:26,846 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 12:06:32, time: 0.726, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0464, loss_cls: 0.2050, acc: 92.7390, loss_bbox: 0.2550, loss_mask: 0.2568, loss: 0.7902 2024-06-01 02:21:01,794 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 12:06:07, time: 0.699, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0460, loss_cls: 0.2220, acc: 92.1821, loss_bbox: 0.2728, loss_mask: 0.2640, loss: 0.8302 2024-06-01 02:21:37,249 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 12:05:43, time: 0.709, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0481, loss_cls: 0.2207, acc: 92.1506, loss_bbox: 0.2753, loss_mask: 0.2682, loss: 0.8413 2024-06-01 02:22:16,979 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 12:05:33, time: 0.795, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0489, loss_cls: 0.2108, acc: 92.5090, loss_bbox: 0.2701, loss_mask: 0.2635, loss: 0.8203 2024-06-01 02:22:49,428 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 12:05:00, time: 0.649, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0465, loss_cls: 0.2038, acc: 92.7517, loss_bbox: 0.2538, loss_mask: 0.2596, loss: 0.7892 2024-06-01 02:23:21,346 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 12:04:25, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0469, loss_cls: 0.2049, acc: 92.7419, loss_bbox: 0.2559, loss_mask: 0.2624, loss: 0.7983 2024-06-01 02:23:53,602 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 12:03:51, time: 0.645, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0502, loss_cls: 0.2157, acc: 92.2634, loss_bbox: 0.2771, loss_mask: 0.2621, loss: 0.8322 2024-06-01 02:24:25,815 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 12:03:17, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0491, loss_cls: 0.2111, acc: 92.4370, loss_bbox: 0.2717, loss_mask: 0.2655, loss: 0.8245 2024-06-01 02:24:57,907 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 12:02:43, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0494, loss_cls: 0.2154, acc: 92.4216, loss_bbox: 0.2641, loss_mask: 0.2654, loss: 0.8228 2024-06-01 02:25:30,276 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 12:02:09, time: 0.647, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0514, loss_cls: 0.2216, acc: 92.2964, loss_bbox: 0.2729, loss_mask: 0.2585, loss: 0.8326 2024-06-01 02:26:02,480 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 12:01:35, time: 0.644, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0478, loss_cls: 0.2067, acc: 92.6763, loss_bbox: 0.2564, loss_mask: 0.2602, loss: 0.7962 2024-06-01 02:26:35,051 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 12:01:03, time: 0.651, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0535, loss_cls: 0.2290, acc: 91.9397, loss_bbox: 0.2832, loss_mask: 0.2734, loss: 0.8685 2024-06-01 02:27:06,939 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 12:00:28, time: 0.638, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0489, loss_cls: 0.2137, acc: 92.4160, loss_bbox: 0.2626, loss_mask: 0.2625, loss: 0.8125 2024-06-01 02:27:38,721 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 11:59:53, time: 0.636, 
data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0485, loss_cls: 0.2062, acc: 92.7896, loss_bbox: 0.2553, loss_mask: 0.2646, loss: 0.8017
2024-06-01 02:28:11,093 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 11:59:19, time: 0.647, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0483, loss_cls: 0.2133, acc: 92.5935, loss_bbox: 0.2639, loss_mask: 0.2636, loss: 0.8188
2024-06-01 02:28:43,412 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 11:58:46, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0481, loss_cls: 0.2049, acc: 92.6445, loss_bbox: 0.2539, loss_mask: 0.2563, loss: 0.7921
2024-06-01 02:29:15,683 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 11:58:12, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0500, loss_cls: 0.2165, acc: 92.2590, loss_bbox: 0.2667, loss_mask: 0.2619, loss: 0.8221
2024-06-01 02:29:47,915 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 11:57:38, time: 0.645, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0476, loss_cls: 0.2194, acc: 92.2812, loss_bbox: 0.2708, loss_mask: 0.2586, loss: 0.8249
2024-06-01 02:30:08,288 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-06-01 02:31:53,104 - mmdet - INFO - Evaluating bbox...
2024-06-01 02:32:17,669 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.407
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.656
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.442
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.234
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.451
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.559
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.337
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.587
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.700
2024-06-01 02:32:17,669 - mmdet - INFO - Evaluating segm...
2024-06-01 02:32:45,459 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.375
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.620
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.394
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.161
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.410
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.598
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.492
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.492
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.492
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.269
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.547
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.694
2024-06-01 02:32:45,997 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 02:32:45,999 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.4070, bbox_mAP_50: 0.6560, bbox_mAP_75: 0.4420, bbox_mAP_s: 0.2340, bbox_mAP_m: 0.4510, bbox_mAP_l: 0.5590, bbox_mAP_copypaste: 0.407 0.656 0.442 0.234 0.451 0.559, segm_mAP: 0.3750, segm_mAP_50: 0.6200, segm_mAP_75: 0.3940, segm_mAP_s: 0.1610, segm_mAP_m: 0.4100, segm_mAP_l: 0.5980, segm_mAP_copypaste: 0.375 0.620 0.394 0.161 0.410 0.598
2024-06-01 02:33:28,375 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 11:56:17, time: 0.847, data_time: 0.107, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0502, loss_cls: 0.2084, acc: 92.4329, loss_bbox: 0.2679, loss_mask: 0.2595, loss: 0.8120
2024-06-01 02:34:00,525 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 11:55:43, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0488, loss_cls: 0.2095, acc: 92.6094, loss_bbox: 0.2638, loss_mask: 0.2549, loss: 0.8040
2024-06-01 02:34:32,406 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 11:55:08, time: 0.637, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0464, loss_cls: 0.1993, acc: 92.9189, loss_bbox: 0.2555, loss_mask: 0.2596, loss: 0.7839
2024-06-01 02:35:09,307 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 11:54:48, time: 0.738, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0466, loss_cls: 0.2015, acc: 92.7954, loss_bbox: 0.2602, loss_mask: 0.2623, loss: 0.7949
2024-06-01 02:35:41,856 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 11:54:15, time: 0.651, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0522, loss_cls: 0.2134, acc: 92.2803, loss_bbox: 0.2767, loss_mask: 0.2637, loss: 0.8336
2024-06-01 02:36:14,533 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 11:53:43, time: 0.653, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0473, loss_cls: 0.2153, acc: 92.1624, loss_bbox: 0.2693, loss_mask: 0.2584, loss: 0.8137
2024-06-01 02:36:46,706 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 11:53:09, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0482, loss_cls: 0.2118, acc: 92.4441, loss_bbox: 0.2625, loss_mask: 0.2614, loss: 0.8104
2024-06-01 02:37:19,046 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 11:52:36, time: 0.647, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0469, loss_cls: 0.2051, acc: 92.6013, loss_bbox: 0.2633,
loss_mask: 0.2547, loss: 0.7961 2024-06-01 02:37:51,276 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 11:52:02, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0459, loss_cls: 0.2147, acc: 92.3071, loss_bbox: 0.2703, loss_mask: 0.2556, loss: 0.8119 2024-06-01 02:38:23,235 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 11:51:27, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0455, loss_cls: 0.1997, acc: 92.8391, loss_bbox: 0.2522, loss_mask: 0.2560, loss: 0.7785 2024-06-01 02:38:55,883 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 11:50:55, time: 0.653, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0471, loss_cls: 0.2054, acc: 92.6643, loss_bbox: 0.2597, loss_mask: 0.2566, loss: 0.7950 2024-06-01 02:39:27,972 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 11:50:21, time: 0.642, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0502, loss_cls: 0.2152, acc: 92.2354, loss_bbox: 0.2730, loss_mask: 0.2560, loss: 0.8200 2024-06-01 02:40:00,349 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 11:49:48, time: 0.648, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0463, loss_cls: 0.2045, acc: 92.6680, loss_bbox: 0.2576, loss_mask: 0.2604, loss: 0.7937 2024-06-01 02:40:32,307 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 11:49:13, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0458, loss_cls: 0.2046, acc: 92.7083, loss_bbox: 0.2609, loss_mask: 0.2531, loss: 0.7881 2024-06-01 02:41:04,711 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 11:48:40, time: 0.648, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0497, loss_cls: 0.2049, acc: 92.5549, loss_bbox: 0.2621, loss_mask: 0.2559, loss: 0.7970 2024-06-01 02:41:37,191 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 11:48:07, time: 0.650, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0466, loss_cls: 0.2103, acc: 92.4473, loss_bbox: 0.2703, loss_mask: 0.2577, loss: 0.8092 2024-06-01 02:42:09,281 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 11:47:33, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0471, loss_cls: 0.2091, acc: 92.4592, loss_bbox: 0.2616, loss_mask: 0.2566, loss: 0.7999 2024-06-01 02:42:41,537 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 11:46:59, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0482, loss_cls: 0.2099, acc: 92.4382, loss_bbox: 0.2688, loss_mask: 0.2635, loss: 0.8165 2024-06-01 02:43:13,883 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 11:46:26, time: 0.646, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0465, loss_cls: 0.2020, acc: 92.7261, loss_bbox: 0.2537, loss_mask: 0.2552, loss: 0.7805 2024-06-01 02:43:46,251 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 11:45:53, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0485, loss_cls: 0.2048, acc: 92.6531, loss_bbox: 0.2605, loss_mask: 0.2581, loss: 0.7973 2024-06-01 02:44:18,673 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 11:45:20, time: 0.649, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0507, loss_cls: 0.2120, acc: 92.4404, loss_bbox: 0.2645, loss_mask: 0.2549, loss: 0.8083 2024-06-01 02:44:53,392 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 11:44:53, time: 0.694, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0474, loss_cls: 0.2035, acc: 92.6838, loss_bbox: 0.2601, loss_mask: 0.2520, loss: 0.7877 2024-06-01 02:45:25,635 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 11:44:20, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0472, loss_cls: 0.2033, acc: 92.6570, loss_bbox: 0.2612, loss_mask: 0.2551, loss: 0.7912 2024-06-01 02:46:00,898 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 11:43:54, time: 0.705, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0502, loss_cls: 0.2123, acc: 92.3799, loss_bbox: 0.2641, loss_mask: 0.2517, loss: 0.8042 2024-06-01 02:46:37,997 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 11:43:34, time: 0.742, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0448, loss_cls: 0.1972, acc: 92.9189, loss_bbox: 0.2503, loss_mask: 0.2495, loss: 0.7665 2024-06-01 02:47:17,583 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 11:43:21, time: 0.792, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0476, loss_cls: 0.1967, acc: 92.9150, loss_bbox: 0.2500, loss_mask: 0.2537, loss: 0.7732 2024-06-01 02:47:49,731 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 11:42:47, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0465, loss_cls: 0.2062, acc: 92.6316, loss_bbox: 0.2613, loss_mask: 0.2593, loss: 0.7997 2024-06-01 02:48:22,025 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 11:42:14, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0456, loss_cls: 0.1967, acc: 92.8889, loss_bbox: 0.2564, loss_mask: 0.2585, loss: 0.7833 2024-06-01 02:48:56,898 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 11:41:47, time: 0.697, data_time: 0.043, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0457, loss_cls: 0.2094, acc: 92.4426, loss_bbox: 0.2633, loss_mask: 0.2623, loss: 0.8062 2024-06-01 02:49:29,359 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 11:41:14, time: 0.649, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0463, loss_cls: 0.2069, acc: 92.5510, loss_bbox: 0.2611, loss_mask: 0.2600, loss: 0.7976 2024-06-01 02:50:01,541 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 11:40:40, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0453, loss_cls: 0.2031, acc: 92.8372, loss_bbox: 0.2547, loss_mask: 0.2628, loss: 0.7898 2024-06-01 02:50:33,730 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 11:40:06, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0489, loss_cls: 0.2140, acc: 92.3696, loss_bbox: 0.2661, loss_mask: 0.2561, loss: 0.8114 2024-06-01 02:51:05,689 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 11:39:32, time: 0.639, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0448, loss_cls: 0.2051, acc: 92.7241, loss_bbox: 0.2587, loss_mask: 0.2512, loss: 0.7837 2024-06-01 02:51:37,907 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 11:38:58, time: 0.644, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0481, loss_cls: 0.2130, acc: 92.4099, loss_bbox: 0.2687, loss_mask: 0.2634, loss: 0.8183 2024-06-01 02:52:10,316 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 11:38:25, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0242, 
loss_rpn_bbox: 0.0452, loss_cls: 0.2099, acc: 92.6152, loss_bbox: 0.2603, loss_mask: 0.2603, loss: 0.7998 2024-06-01 02:52:42,949 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 11:37:52, time: 0.653, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0478, loss_cls: 0.2083, acc: 92.5227, loss_bbox: 0.2625, loss_mask: 0.2632, loss: 0.8065 2024-06-01 02:53:15,141 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 11:37:19, time: 0.644, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0445, loss_cls: 0.2057, acc: 92.6335, loss_bbox: 0.2574, loss_mask: 0.2535, loss: 0.7861 2024-06-01 02:53:47,255 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 11:36:45, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0478, loss_cls: 0.2096, acc: 92.4941, loss_bbox: 0.2651, loss_mask: 0.2612, loss: 0.8096 2024-06-01 02:54:19,251 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 11:36:10, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0499, loss_cls: 0.2225, acc: 92.2437, loss_bbox: 0.2717, loss_mask: 0.2602, loss: 0.8308 2024-06-01 02:54:51,903 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 11:35:38, time: 0.653, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0484, loss_cls: 0.2104, acc: 92.4009, loss_bbox: 0.2643, loss_mask: 0.2582, loss: 0.8089 2024-06-01 02:55:24,034 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 11:35:04, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0474, loss_cls: 0.2011, acc: 92.8359, loss_bbox: 0.2553, loss_mask: 0.2563, loss: 0.7850 2024-06-01 02:55:56,176 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 11:34:30, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0458, loss_cls: 0.2075, acc: 92.5845, loss_bbox: 0.2621, loss_mask: 0.2603, loss: 0.8004 2024-06-01 02:56:28,522 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 11:33:57, time: 0.647, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0496, loss_cls: 0.2145, acc: 92.3047, loss_bbox: 0.2648, loss_mask: 0.2609, loss: 0.8157 2024-06-01 02:57:03,031 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 11:33:29, time: 0.690, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0467, loss_cls: 0.2038, acc: 92.6296, loss_bbox: 0.2599, loss_mask: 0.2519, loss: 0.7859 2024-06-01 02:57:35,396 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 11:32:56, time: 0.647, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0479, loss_cls: 0.2127, acc: 92.4148, loss_bbox: 0.2629, loss_mask: 0.2510, loss: 0.8001 2024-06-01 02:58:11,519 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 11:32:32, time: 0.722, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0489, loss_cls: 0.2061, acc: 92.5730, loss_bbox: 0.2608, loss_mask: 0.2579, loss: 0.7998 2024-06-01 02:58:51,813 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 11:32:20, time: 0.806, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0449, loss_cls: 0.2017, acc: 92.7148, loss_bbox: 0.2574, loss_mask: 0.2524, loss: 0.7803 2024-06-01 02:59:28,799 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 11:31:58, time: 0.740, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0488, loss_cls: 0.2069, acc: 92.6479, loss_bbox: 0.2632, loss_mask: 0.2594, 
loss: 0.8040 2024-06-01 03:00:01,437 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 11:31:25, time: 0.653, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0481, loss_cls: 0.2135, acc: 92.3960, loss_bbox: 0.2664, loss_mask: 0.2597, loss: 0.8139 2024-06-01 03:00:36,328 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 11:30:59, time: 0.698, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0470, loss_cls: 0.2036, acc: 92.6577, loss_bbox: 0.2598, loss_mask: 0.2573, loss: 0.7924 2024-06-01 03:01:08,563 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 11:30:25, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0471, loss_cls: 0.2079, acc: 92.4312, loss_bbox: 0.2613, loss_mask: 0.2502, loss: 0.7917 2024-06-01 03:01:41,272 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 11:29:52, time: 0.654, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0477, loss_cls: 0.2141, acc: 92.4521, loss_bbox: 0.2674, loss_mask: 0.2557, loss: 0.8115 2024-06-01 03:02:13,345 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 11:29:18, time: 0.641, data_time: 0.043, memory: 13277, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0467, loss_cls: 0.2081, acc: 92.5483, loss_bbox: 0.2596, loss_mask: 0.2585, loss: 0.7981 2024-06-01 03:02:45,465 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 11:28:44, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0462, loss_cls: 0.2036, acc: 92.6978, loss_bbox: 0.2569, loss_mask: 0.2584, loss: 0.7899 2024-06-01 03:03:17,475 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 11:28:10, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0487, loss_cls: 0.2124, acc: 92.2542, loss_bbox: 0.2706, loss_mask: 0.2666, loss: 0.8223 2024-06-01 03:03:49,681 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 11:27:36, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0450, loss_cls: 0.1947, acc: 93.0774, loss_bbox: 0.2474, loss_mask: 0.2534, loss: 0.7662 2024-06-01 03:04:22,243 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 11:27:03, time: 0.651, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0479, loss_cls: 0.2149, acc: 92.2261, loss_bbox: 0.2706, loss_mask: 0.2664, loss: 0.8251 2024-06-01 03:04:54,410 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 11:26:29, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0455, loss_cls: 0.1998, acc: 92.9128, loss_bbox: 0.2513, loss_mask: 0.2552, loss: 0.7786 2024-06-01 03:05:26,523 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 11:25:55, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0471, loss_cls: 0.2077, acc: 92.5190, loss_bbox: 0.2606, loss_mask: 0.2560, loss: 0.7979 2024-06-01 03:05:58,493 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 11:25:21, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0440, loss_cls: 0.1937, acc: 92.9978, loss_bbox: 0.2468, loss_mask: 0.2450, loss: 0.7524 2024-06-01 03:06:30,804 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 11:24:47, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0497, loss_cls: 0.1989, acc: 92.7961, loss_bbox: 0.2561, loss_mask: 0.2502, loss: 0.7820 2024-06-01 03:07:02,750 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
11:24:13, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0492, loss_cls: 0.2040, acc: 92.7114, loss_bbox: 0.2542, loss_mask: 0.2523, loss: 0.7869 2024-06-01 03:07:34,851 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 11:23:39, time: 0.642, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0464, loss_cls: 0.2063, acc: 92.4673, loss_bbox: 0.2612, loss_mask: 0.2557, loss: 0.7933 2024-06-01 03:08:06,954 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 11:23:05, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0445, loss_cls: 0.2035, acc: 92.7893, loss_bbox: 0.2493, loss_mask: 0.2496, loss: 0.7718 2024-06-01 03:08:41,260 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 11:22:36, time: 0.686, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0473, loss_cls: 0.2095, acc: 92.4014, loss_bbox: 0.2636, loss_mask: 0.2619, loss: 0.8093 2024-06-01 03:09:13,125 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 11:22:02, time: 0.637, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0466, loss_cls: 0.1991, acc: 92.8247, loss_bbox: 0.2528, loss_mask: 0.2565, loss: 0.7799 2024-06-01 03:09:48,113 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 11:21:35, time: 0.700, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0488, loss_cls: 0.2101, acc: 92.4084, loss_bbox: 0.2665, loss_mask: 0.2584, loss: 0.8116 2024-06-01 03:10:28,049 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 11:21:20, time: 0.799, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0492, loss_cls: 0.2064, acc: 92.5701, loss_bbox: 0.2602, loss_mask: 0.2532, loss: 0.7931 2024-06-01 03:11:07,092 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 11:21:03, time: 0.781, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0469, loss_cls: 0.2007, acc: 92.8474, loss_bbox: 0.2546, loss_mask: 0.2539, loss: 0.7815 2024-06-01 03:11:39,509 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 11:20:30, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0500, loss_cls: 0.2093, acc: 92.3933, loss_bbox: 0.2657, loss_mask: 0.2606, loss: 0.8121 2024-06-01 03:12:14,679 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 11:20:03, time: 0.703, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0494, loss_cls: 0.2074, acc: 92.5481, loss_bbox: 0.2639, loss_mask: 0.2528, loss: 0.7988 2024-06-01 03:12:47,173 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 11:19:30, time: 0.650, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0468, loss_cls: 0.2034, acc: 92.7664, loss_bbox: 0.2555, loss_mask: 0.2546, loss: 0.7854 2024-06-01 03:13:19,174 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 11:18:56, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0486, loss_cls: 0.2090, acc: 92.5105, loss_bbox: 0.2618, loss_mask: 0.2571, loss: 0.8022 2024-06-01 03:13:51,523 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 11:18:22, time: 0.647, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0462, loss_cls: 0.2043, acc: 92.6831, loss_bbox: 0.2588, loss_mask: 0.2537, loss: 0.7877 2024-06-01 03:14:23,746 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 11:17:48, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0260, loss_rpn_bbox: 
0.0463, loss_cls: 0.2088, acc: 92.4041, loss_bbox: 0.2607, loss_mask: 0.2531, loss: 0.7948 2024-06-01 03:14:56,237 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 11:17:15, time: 0.650, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0477, loss_cls: 0.2027, acc: 92.8298, loss_bbox: 0.2564, loss_mask: 0.2547, loss: 0.7894 2024-06-01 03:15:28,124 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 11:16:41, time: 0.638, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0472, loss_cls: 0.2001, acc: 92.6970, loss_bbox: 0.2558, loss_mask: 0.2570, loss: 0.7856 2024-06-01 03:16:00,354 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 11:16:07, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0488, loss_cls: 0.2135, acc: 92.3882, loss_bbox: 0.2668, loss_mask: 0.2623, loss: 0.8162 2024-06-01 03:16:32,254 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 11:15:32, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0473, loss_cls: 0.1993, acc: 92.7927, loss_bbox: 0.2544, loss_mask: 0.2575, loss: 0.7844 2024-06-01 03:17:04,418 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 11:14:59, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0481, loss_cls: 0.2019, acc: 92.7148, loss_bbox: 0.2584, loss_mask: 0.2506, loss: 0.7828 2024-06-01 03:17:36,599 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 11:14:25, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0440, loss_cls: 0.1917, acc: 93.1182, loss_bbox: 0.2448, loss_mask: 0.2552, loss: 0.7589 2024-06-01 03:18:09,175 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 11:13:52, time: 0.652, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0459, loss_cls: 0.2038, acc: 92.6094, loss_bbox: 0.2569, loss_mask: 0.2536, loss: 0.7852 2024-06-01 03:18:41,410 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 11:13:18, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0490, loss_cls: 0.2181, acc: 92.1724, loss_bbox: 0.2708, loss_mask: 0.2692, loss: 0.8328 2024-06-01 03:19:13,725 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 11:12:45, time: 0.646, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0476, loss_cls: 0.1998, acc: 92.8130, loss_bbox: 0.2540, loss_mask: 0.2597, loss: 0.7862 2024-06-01 03:19:45,636 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 11:12:10, time: 0.638, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0469, loss_cls: 0.2087, acc: 92.6228, loss_bbox: 0.2617, loss_mask: 0.2556, loss: 0.7988 2024-06-01 03:20:21,090 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 11:11:44, time: 0.709, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0476, loss_cls: 0.2043, acc: 92.6416, loss_bbox: 0.2586, loss_mask: 0.2660, loss: 0.8015 2024-06-01 03:20:54,022 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 11:11:12, time: 0.659, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0455, loss_cls: 0.2046, acc: 92.7334, loss_bbox: 0.2514, loss_mask: 0.2521, loss: 0.7761 2024-06-01 03:21:28,455 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 11:10:43, time: 0.689, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0464, loss_cls: 0.2064, acc: 92.7422, loss_bbox: 0.2525, loss_mask: 0.2507, loss: 0.7810 
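Note that the per-iteration totals above are consistent with loss being the plain sum of the five logged components (loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox, loss_mask); acc is a classification accuracy, not a loss term. A minimal sketch in plain Python, using values copied from the Epoch [4][4400/7330] entry just above (the ~1e-4 gap is only the four-decimal rounding of each component):

    # Check that the logged total is, up to rounding, the sum of the logged components.
    # Values copied from the Epoch [4][4400/7330] entry above.
    components = {
        "loss_rpn_cls": 0.0249,
        "loss_rpn_bbox": 0.0464,
        "loss_cls": 0.2064,
        "loss_bbox": 0.2525,
        "loss_mask": 0.2507,
    }
    total = sum(components.values())
    print(f"sum of components = {total:.4f}, logged loss = 0.7810")
    assert abs(total - 0.7810) < 5e-4  # five values each rounded to 4 decimals
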
2024-06-01 03:22:05,621 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 11:10:21, time: 0.743, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0510, loss_cls: 0.2204, acc: 92.1628, loss_bbox: 0.2722, loss_mask: 0.2620, loss: 0.8319 2024-06-01 03:22:45,094 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 11:10:04, time: 0.789, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0452, loss_cls: 0.1976, acc: 92.8879, loss_bbox: 0.2518, loss_mask: 0.2496, loss: 0.7663 2024-06-01 03:23:17,216 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 11:09:30, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0455, loss_cls: 0.2024, acc: 92.7058, loss_bbox: 0.2510, loss_mask: 0.2581, loss: 0.7824 2024-06-01 03:23:49,306 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 11:08:56, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0448, loss_cls: 0.2030, acc: 92.8679, loss_bbox: 0.2539, loss_mask: 0.2548, loss: 0.7784 2024-06-01 03:24:25,147 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 11:08:30, time: 0.717, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0474, loss_cls: 0.2108, acc: 92.4377, loss_bbox: 0.2619, loss_mask: 0.2583, loss: 0.8034 2024-06-01 03:24:57,466 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 11:07:57, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0432, loss_cls: 0.1982, acc: 92.8870, loss_bbox: 0.2472, loss_mask: 0.2451, loss: 0.7558 2024-06-01 03:25:29,315 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 11:07:22, time: 0.637, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0466, loss_cls: 0.2021, acc: 92.9097, loss_bbox: 0.2526, loss_mask: 0.2576, loss: 0.7857 2024-06-01 03:26:01,704 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 11:06:49, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0482, loss_cls: 0.2128, acc: 92.4253, loss_bbox: 0.2619, loss_mask: 0.2582, loss: 0.8060 2024-06-01 03:26:33,871 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 11:06:15, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0460, loss_cls: 0.2045, acc: 92.6465, loss_bbox: 0.2570, loss_mask: 0.2523, loss: 0.7850 2024-06-01 03:27:05,926 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 11:05:41, time: 0.641, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0469, loss_cls: 0.2066, acc: 92.6245, loss_bbox: 0.2570, loss_mask: 0.2563, loss: 0.7923 2024-06-01 03:27:38,128 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 11:05:07, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0481, loss_cls: 0.2107, acc: 92.5913, loss_bbox: 0.2615, loss_mask: 0.2564, loss: 0.8019 2024-06-01 03:28:10,076 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 11:04:32, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0442, loss_cls: 0.2015, acc: 92.7180, loss_bbox: 0.2532, loss_mask: 0.2582, loss: 0.7790 2024-06-01 03:28:41,961 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 11:03:58, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0429, loss_cls: 0.1937, acc: 93.0098, loss_bbox: 0.2455, loss_mask: 0.2517, loss: 0.7574 2024-06-01 03:29:14,056 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 11:03:24, 
time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0464, loss_cls: 0.2052, acc: 92.7043, loss_bbox: 0.2560, loss_mask: 0.2565, loss: 0.7891 2024-06-01 03:29:46,220 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 11:02:50, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0472, loss_cls: 0.2054, acc: 92.5151, loss_bbox: 0.2641, loss_mask: 0.2572, loss: 0.7980 2024-06-01 03:30:18,597 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 11:02:17, time: 0.648, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0499, loss_cls: 0.2157, acc: 92.4138, loss_bbox: 0.2645, loss_mask: 0.2657, loss: 0.8233 2024-06-01 03:30:51,308 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 11:01:44, time: 0.654, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0462, loss_cls: 0.1978, acc: 92.9092, loss_bbox: 0.2518, loss_mask: 0.2526, loss: 0.7751 2024-06-01 03:31:23,484 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 11:01:10, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0446, loss_cls: 0.2051, acc: 92.6494, loss_bbox: 0.2522, loss_mask: 0.2567, loss: 0.7834 2024-06-01 03:31:55,219 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 11:00:35, time: 0.635, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0460, loss_cls: 0.1952, acc: 92.9424, loss_bbox: 0.2521, loss_mask: 0.2560, loss: 0.7723 2024-06-01 03:32:30,796 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 11:00:09, time: 0.711, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0471, loss_cls: 0.2005, acc: 92.8225, loss_bbox: 0.2556, loss_mask: 0.2534, loss: 0.7817 2024-06-01 03:33:04,959 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 10:59:39, time: 0.683, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0464, loss_cls: 0.2041, acc: 92.6616, loss_bbox: 0.2541, loss_mask: 0.2566, loss: 0.7858 2024-06-01 03:33:38,968 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 10:59:10, time: 0.680, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0449, loss_cls: 0.2055, acc: 92.7024, loss_bbox: 0.2563, loss_mask: 0.2543, loss: 0.7849 2024-06-01 03:34:17,443 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 10:58:50, time: 0.769, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0460, loss_cls: 0.1993, acc: 92.8481, loss_bbox: 0.2545, loss_mask: 0.2566, loss: 0.7797 2024-06-01 03:34:51,796 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 10:58:20, time: 0.687, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0467, loss_cls: 0.2057, acc: 92.6165, loss_bbox: 0.2488, loss_mask: 0.2525, loss: 0.7782 2024-06-01 03:35:24,008 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 10:57:47, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0463, loss_cls: 0.2063, acc: 92.6223, loss_bbox: 0.2572, loss_mask: 0.2521, loss: 0.7877 2024-06-01 03:35:58,277 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 10:57:17, time: 0.685, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0433, loss_cls: 0.1956, acc: 92.8745, loss_bbox: 0.2469, loss_mask: 0.2477, loss: 0.7580 2024-06-01 03:36:30,433 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 10:56:43, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0476, 
loss_cls: 0.2138, acc: 92.2605, loss_bbox: 0.2698, loss_mask: 0.2593, loss: 0.8147 2024-06-01 03:37:02,415 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 10:56:09, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0463, loss_cls: 0.1985, acc: 92.8889, loss_bbox: 0.2512, loss_mask: 0.2557, loss: 0.7752 2024-06-01 03:37:34,518 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 10:55:35, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0452, loss_cls: 0.2008, acc: 92.8721, loss_bbox: 0.2510, loss_mask: 0.2499, loss: 0.7710 2024-06-01 03:38:07,330 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 10:55:02, time: 0.656, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0464, loss_cls: 0.2034, acc: 92.6982, loss_bbox: 0.2547, loss_mask: 0.2533, loss: 0.7815 2024-06-01 03:38:39,592 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 10:54:29, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0474, loss_cls: 0.2044, acc: 92.6704, loss_bbox: 0.2550, loss_mask: 0.2576, loss: 0.7905 2024-06-01 03:39:11,514 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 10:53:54, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0442, loss_cls: 0.1987, acc: 92.8584, loss_bbox: 0.2524, loss_mask: 0.2535, loss: 0.7718 2024-06-01 03:39:43,434 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 10:53:20, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0442, loss_cls: 0.2027, acc: 92.7808, loss_bbox: 0.2517, loss_mask: 0.2522, loss: 0.7754 2024-06-01 03:40:15,550 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 10:52:46, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0495, loss_cls: 0.2086, acc: 92.5930, loss_bbox: 0.2559, loss_mask: 0.2495, loss: 0.7897 2024-06-01 03:40:47,447 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 10:52:12, time: 0.638, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0446, loss_cls: 0.2024, acc: 92.8049, loss_bbox: 0.2498, loss_mask: 0.2474, loss: 0.7667 2024-06-01 03:41:19,440 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 10:51:37, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0451, loss_cls: 0.2050, acc: 92.6521, loss_bbox: 0.2559, loss_mask: 0.2507, loss: 0.7821 2024-06-01 03:41:51,391 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 10:51:03, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0495, loss_cls: 0.2064, acc: 92.5068, loss_bbox: 0.2623, loss_mask: 0.2558, loss: 0.7994 2024-06-01 03:42:23,604 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 10:50:29, time: 0.644, data_time: 0.045, memory: 13277, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0482, loss_cls: 0.2109, acc: 92.3665, loss_bbox: 0.2647, loss_mask: 0.2564, loss: 0.8059 2024-06-01 03:42:55,353 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 10:49:55, time: 0.635, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0446, loss_cls: 0.2098, acc: 92.5437, loss_bbox: 0.2584, loss_mask: 0.2616, loss: 0.7980 2024-06-01 03:43:27,544 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 10:49:21, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0435, loss_cls: 0.1961, acc: 92.9521, loss_bbox: 0.2507, loss_mask: 0.2546, loss: 0.7684 2024-06-01 
03:44:02,583 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 10:48:53, time: 0.701, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0463, loss_cls: 0.2069, acc: 92.5830, loss_bbox: 0.2600, loss_mask: 0.2611, loss: 0.8004 2024-06-01 03:44:34,967 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 10:48:20, time: 0.648, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0468, loss_cls: 0.2177, acc: 92.2683, loss_bbox: 0.2655, loss_mask: 0.2594, loss: 0.8158 2024-06-01 03:45:09,956 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 10:47:52, time: 0.700, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0462, loss_cls: 0.1977, acc: 92.8743, loss_bbox: 0.2554, loss_mask: 0.2468, loss: 0.7699 2024-06-01 03:45:48,842 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 10:47:32, time: 0.778, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0489, loss_cls: 0.2121, acc: 92.4685, loss_bbox: 0.2614, loss_mask: 0.2511, loss: 0.7997 2024-06-01 03:46:25,986 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 10:47:08, time: 0.743, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0467, loss_cls: 0.2103, acc: 92.4651, loss_bbox: 0.2628, loss_mask: 0.2592, loss: 0.8040 2024-06-01 03:46:58,234 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 10:46:34, time: 0.645, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0416, loss_cls: 0.1935, acc: 93.0771, loss_bbox: 0.2435, loss_mask: 0.2374, loss: 0.7381 2024-06-01 03:47:32,328 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 10:46:05, time: 0.682, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0457, loss_cls: 0.2033, acc: 92.7725, loss_bbox: 0.2525, loss_mask: 0.2518, loss: 0.7758 2024-06-01 03:48:04,575 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 10:45:31, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0478, loss_cls: 0.2015, acc: 92.7449, loss_bbox: 0.2550, loss_mask: 0.2514, loss: 0.7799 2024-06-01 03:48:36,907 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 10:44:57, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0472, loss_cls: 0.2065, acc: 92.5928, loss_bbox: 0.2595, loss_mask: 0.2570, loss: 0.7954 2024-06-01 03:49:08,844 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 10:44:23, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0460, loss_cls: 0.2055, acc: 92.5371, loss_bbox: 0.2578, loss_mask: 0.2569, loss: 0.7911 2024-06-01 03:49:40,817 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 10:43:49, time: 0.639, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0485, loss_cls: 0.2072, acc: 92.6021, loss_bbox: 0.2587, loss_mask: 0.2591, loss: 0.7995 2024-06-01 03:50:13,125 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 10:43:15, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0478, loss_cls: 0.2039, acc: 92.6602, loss_bbox: 0.2560, loss_mask: 0.2537, loss: 0.7887 2024-06-01 03:50:44,984 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 10:42:41, time: 0.637, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0433, loss_cls: 0.1991, acc: 93.0369, loss_bbox: 0.2503, loss_mask: 0.2474, loss: 0.7624 2024-06-01 03:51:16,889 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 10:42:06, time: 0.638, 
data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0436, loss_cls: 0.2063, acc: 92.7202, loss_bbox: 0.2521, loss_mask: 0.2472, loss: 0.7718 2024-06-01 03:51:48,777 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 10:41:32, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0469, loss_cls: 0.2099, acc: 92.4458, loss_bbox: 0.2596, loss_mask: 0.2564, loss: 0.7985 2024-06-01 03:52:20,938 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 10:40:58, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0452, loss_cls: 0.2058, acc: 92.6472, loss_bbox: 0.2527, loss_mask: 0.2499, loss: 0.7784 2024-06-01 03:52:52,915 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 10:40:24, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0461, loss_cls: 0.2090, acc: 92.5698, loss_bbox: 0.2598, loss_mask: 0.2523, loss: 0.7927 2024-06-01 03:53:25,280 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 10:39:50, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0500, loss_cls: 0.2160, acc: 92.2202, loss_bbox: 0.2727, loss_mask: 0.2638, loss: 0.8299
2024-06-01 03:53:45,095 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-06-01 03:55:26,955 - mmdet - INFO - Evaluating bbox...
2024-06-01 03:55:52,659 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.668
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.466
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.245
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.470
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.582
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.341
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.595
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.717
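The twelve lines above are the standard COCO (pycocotools) summary for the bbox task; the same block follows for segm, and the Epoch(val) line further below restates the six AP values as bbox_mAP / segm_mAP fields. To track these numbers across epochs one could parse them straight out of the log; the regular expression and helper name below are an ad-hoc assumption about this log's text format, not anything provided by mmdet or pycocotools:

    # Ad-hoc helper (assumed, not part of mmdet): pull COCO summary values such as
    # "Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425"
    # out of raw log text, keyed by (AP/AR, IoU range, area, maxDets).
    import re

    SUMMARY_LINE = re.compile(
        r"Average (?:Precision|Recall)\s+\((AP|AR)\)\s+@\[\s*IoU=([\d.:]+)\s*\|"
        r"\s*area=\s*(\w+)\s*\|\s*maxDets=\s*(\d+)\s*\]\s*=\s*([-\d.]+)"
    )

    def parse_coco_summary(text):
        return {
            (kind, iou, area, max_dets): float(value)
            for kind, iou, area, max_dets, value in SUMMARY_LINE.findall(text)
        }

    line = "Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425"
    print(parse_coco_summary(line))  # {('AP', '0.50:0.95', 'all', '100'): 0.425}
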
2024-06-01 03:55:52,659 - mmdet - INFO - Evaluating segm...
2024-06-01 03:56:20,295 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.383
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.627
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.403
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.164
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.420
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.607
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.497
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.497
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.497
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.266
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.550
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.704
2024-06-01 03:56:20,928 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 03:56:20,931 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4250, bbox_mAP_50: 0.6680, bbox_mAP_75: 0.4660, bbox_mAP_s: 0.2450, bbox_mAP_m: 0.4700, bbox_mAP_l: 0.5820, bbox_mAP_copypaste: 0.425 0.668 0.466 0.245 0.470 0.582, segm_mAP: 0.3830, segm_mAP_50: 0.6270, segm_mAP_75: 0.4030, segm_mAP_s: 0.1640, segm_mAP_m: 0.4200, segm_mAP_l: 0.6070, segm_mAP_copypaste: 0.383 0.627 0.403 0.164 0.420 0.607
2024-06-01 03:57:06,751 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 10:38:45, time: 0.916, data_time: 0.102, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0458, loss_cls: 0.1983, acc: 92.7319, loss_bbox: 0.2523, loss_mask: 0.2491, loss: 0.7676 2024-06-01 03:57:38,818 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 10:38:11, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0436, loss_cls: 0.1917, acc: 93.0464, loss_bbox: 0.2461, loss_mask: 0.2488, loss: 0.7514 2024-06-01 03:58:16,059 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 10:37:47, time: 0.745, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0439, loss_cls: 0.1917, acc: 93.0977, loss_bbox: 0.2451, loss_mask: 0.2473, loss: 0.7509 2024-06-01 03:58:50,160 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 10:37:17, time: 0.682, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0422, loss_cls: 0.1960, acc: 92.9995, loss_bbox: 0.2440, loss_mask: 0.2505, loss: 0.7548 2024-06-01 03:59:22,327 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 10:36:44, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0479, loss_cls: 0.1985, acc: 92.7585, loss_bbox: 0.2547, loss_mask: 0.2569, loss: 0.7818 2024-06-01 03:59:54,810 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 10:36:10, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0443, loss_cls: 0.1955, acc: 92.8269, loss_bbox: 0.2489, loss_mask: 0.2465, loss: 0.7568 2024-06-01 04:00:26,997 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 10:35:37, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0457, loss_cls: 0.1993, acc: 92.6697, loss_bbox: 0.2543, loss_mask: 0.2477, loss: 0.7689 2024-06-01 04:00:58,980 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 10:35:02, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0451, loss_cls: 0.2038, acc: 92.6145, loss_bbox: 0.2562,
loss_mask: 0.2498, loss: 0.7755 2024-06-01 04:01:33,299 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 10:34:33, time: 0.686, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0439, loss_cls: 0.1917, acc: 93.1414, loss_bbox: 0.2432, loss_mask: 0.2508, loss: 0.7515 2024-06-01 04:02:05,602 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 10:33:59, time: 0.646, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0462, loss_cls: 0.1917, acc: 92.9241, loss_bbox: 0.2517, loss_mask: 0.2491, loss: 0.7616 2024-06-01 04:02:37,681 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 10:33:25, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0448, loss_cls: 0.1930, acc: 92.9053, loss_bbox: 0.2465, loss_mask: 0.2423, loss: 0.7472 2024-06-01 04:03:09,878 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 10:32:52, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0461, loss_cls: 0.1987, acc: 92.7676, loss_bbox: 0.2538, loss_mask: 0.2557, loss: 0.7787 2024-06-01 04:03:41,942 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 10:32:18, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0448, loss_cls: 0.1976, acc: 92.9001, loss_bbox: 0.2522, loss_mask: 0.2504, loss: 0.7694 2024-06-01 04:04:13,972 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 10:31:44, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0438, loss_cls: 0.2025, acc: 92.7444, loss_bbox: 0.2569, loss_mask: 0.2554, loss: 0.7808 2024-06-01 04:04:45,942 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 10:31:10, time: 0.639, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0441, loss_cls: 0.1915, acc: 92.8787, loss_bbox: 0.2510, loss_mask: 0.2488, loss: 0.7587 2024-06-01 04:05:18,076 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 10:30:36, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0430, loss_cls: 0.1943, acc: 92.9819, loss_bbox: 0.2475, loss_mask: 0.2442, loss: 0.7517 2024-06-01 04:05:50,331 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 10:30:02, time: 0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0458, loss_cls: 0.1977, acc: 92.8040, loss_bbox: 0.2522, loss_mask: 0.2521, loss: 0.7709 2024-06-01 04:06:22,226 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 10:29:28, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0457, loss_cls: 0.1927, acc: 92.8665, loss_bbox: 0.2531, loss_mask: 0.2504, loss: 0.7645 2024-06-01 04:06:54,161 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 10:28:54, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0457, loss_cls: 0.1933, acc: 92.9463, loss_bbox: 0.2464, loss_mask: 0.2431, loss: 0.7514 2024-06-01 04:07:25,919 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 10:28:19, time: 0.635, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0434, loss_cls: 0.1910, acc: 93.1748, loss_bbox: 0.2422, loss_mask: 0.2417, loss: 0.7403 2024-06-01 04:07:57,918 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 10:27:45, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0453, loss_cls: 0.2044, acc: 92.5117, loss_bbox: 0.2634, loss_mask: 0.2531, loss: 0.7884 2024-06-01 04:08:32,777 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 10:27:17, time: 0.697, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0464, loss_cls: 0.2012, acc: 92.7329, loss_bbox: 0.2538, loss_mask: 0.2557, loss: 0.7788 2024-06-01 04:09:10,334 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 10:26:53, time: 0.751, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0437, loss_cls: 0.1908, acc: 92.9524, loss_bbox: 0.2504, loss_mask: 0.2523, loss: 0.7597 2024-06-01 04:09:45,034 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 10:26:24, time: 0.694, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0431, loss_cls: 0.1927, acc: 92.9253, loss_bbox: 0.2432, loss_mask: 0.2472, loss: 0.7496 2024-06-01 04:10:21,569 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 10:25:58, time: 0.731, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0433, loss_cls: 0.1974, acc: 92.9397, loss_bbox: 0.2491, loss_mask: 0.2473, loss: 0.7594 2024-06-01 04:10:54,360 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 10:25:26, time: 0.656, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0481, loss_cls: 0.2024, acc: 92.5842, loss_bbox: 0.2616, loss_mask: 0.2547, loss: 0.7923 2024-06-01 04:11:26,402 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 10:24:52, time: 0.641, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0462, loss_cls: 0.1959, acc: 92.7900, loss_bbox: 0.2541, loss_mask: 0.2510, loss: 0.7702 2024-06-01 04:11:58,579 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 10:24:18, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0439, loss_cls: 0.1920, acc: 92.9612, loss_bbox: 0.2482, loss_mask: 0.2428, loss: 0.7488 2024-06-01 04:12:34,366 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 10:23:51, time: 0.716, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0438, loss_cls: 0.1928, acc: 93.0486, loss_bbox: 0.2476, loss_mask: 0.2460, loss: 0.7530 2024-06-01 04:13:09,239 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 10:23:22, time: 0.698, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0462, loss_cls: 0.1992, acc: 92.7166, loss_bbox: 0.2582, loss_mask: 0.2527, loss: 0.7780 2024-06-01 04:13:41,788 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 10:22:49, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0507, loss_cls: 0.2117, acc: 92.3181, loss_bbox: 0.2700, loss_mask: 0.2617, loss: 0.8206 2024-06-01 04:14:13,872 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 10:22:15, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0448, loss_cls: 0.1931, acc: 92.9624, loss_bbox: 0.2476, loss_mask: 0.2476, loss: 0.7557 2024-06-01 04:14:45,942 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 10:21:41, time: 0.641, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0454, loss_cls: 0.1976, acc: 92.9360, loss_bbox: 0.2511, loss_mask: 0.2455, loss: 0.7634 2024-06-01 04:15:18,116 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 10:21:08, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0406, loss_cls: 0.1964, acc: 92.9197, loss_bbox: 0.2510, loss_mask: 0.2500, loss: 0.7598 2024-06-01 04:15:50,185 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 10:20:34, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0210, 
loss_rpn_bbox: 0.0419, loss_cls: 0.1793, acc: 93.4138, loss_bbox: 0.2344, loss_mask: 0.2409, loss: 0.7175 2024-06-01 04:16:22,682 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 10:20:01, time: 0.650, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0449, loss_cls: 0.1937, acc: 92.9019, loss_bbox: 0.2494, loss_mask: 0.2411, loss: 0.7525 2024-06-01 04:16:54,881 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 10:19:27, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0462, loss_cls: 0.1945, acc: 92.9719, loss_bbox: 0.2488, loss_mask: 0.2476, loss: 0.7592 2024-06-01 04:17:26,827 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 10:18:53, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0442, loss_cls: 0.1867, acc: 93.2256, loss_bbox: 0.2392, loss_mask: 0.2454, loss: 0.7370 2024-06-01 04:17:59,034 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 10:18:19, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0466, loss_cls: 0.1987, acc: 92.8818, loss_bbox: 0.2565, loss_mask: 0.2502, loss: 0.7762 2024-06-01 04:18:31,619 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 10:17:46, time: 0.652, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0465, loss_cls: 0.2018, acc: 92.6653, loss_bbox: 0.2623, loss_mask: 0.2511, loss: 0.7856 2024-06-01 04:19:04,242 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 10:17:13, time: 0.652, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0442, loss_cls: 0.1963, acc: 92.8328, loss_bbox: 0.2544, loss_mask: 0.2487, loss: 0.7652 2024-06-01 04:19:36,717 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 10:16:40, time: 0.650, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0446, loss_cls: 0.1991, acc: 92.8091, loss_bbox: 0.2525, loss_mask: 0.2514, loss: 0.7711 2024-06-01 04:20:08,593 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 10:16:06, time: 0.637, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0440, loss_cls: 0.1934, acc: 92.9302, loss_bbox: 0.2496, loss_mask: 0.2518, loss: 0.7616 2024-06-01 04:20:46,041 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 10:15:42, time: 0.749, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0466, loss_cls: 0.2025, acc: 92.7019, loss_bbox: 0.2527, loss_mask: 0.2477, loss: 0.7710 2024-06-01 04:21:25,606 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 10:15:21, time: 0.791, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0457, loss_cls: 0.1960, acc: 92.8508, loss_bbox: 0.2511, loss_mask: 0.2494, loss: 0.7664 2024-06-01 04:22:02,670 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 10:14:56, time: 0.741, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0447, loss_cls: 0.1982, acc: 92.7722, loss_bbox: 0.2490, loss_mask: 0.2436, loss: 0.7588 2024-06-01 04:22:35,210 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 10:14:23, time: 0.651, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0443, loss_cls: 0.1946, acc: 92.9497, loss_bbox: 0.2519, loss_mask: 0.2481, loss: 0.7613 2024-06-01 04:23:07,726 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 10:13:50, time: 0.650, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0490, loss_cls: 0.2036, acc: 92.6211, loss_bbox: 0.2608, loss_mask: 0.2494, 
loss: 0.7874 2024-06-01 04:23:39,951 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 10:13:16, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0431, loss_cls: 0.1895, acc: 93.0181, loss_bbox: 0.2435, loss_mask: 0.2465, loss: 0.7449 2024-06-01 04:24:15,342 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 10:12:48, time: 0.708, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0462, loss_cls: 0.2034, acc: 92.5425, loss_bbox: 0.2547, loss_mask: 0.2492, loss: 0.7778 2024-06-01 04:24:47,659 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 10:12:15, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0476, loss_cls: 0.2107, acc: 92.4226, loss_bbox: 0.2617, loss_mask: 0.2555, loss: 0.7997 2024-06-01 04:25:21,999 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 10:11:45, time: 0.687, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0450, loss_cls: 0.1965, acc: 92.8892, loss_bbox: 0.2475, loss_mask: 0.2409, loss: 0.7521 2024-06-01 04:25:54,314 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 10:11:11, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0452, loss_cls: 0.2052, acc: 92.6018, loss_bbox: 0.2576, loss_mask: 0.2501, loss: 0.7834 2024-06-01 04:26:26,475 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 10:10:37, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0435, loss_cls: 0.1918, acc: 93.0649, loss_bbox: 0.2439, loss_mask: 0.2390, loss: 0.7393 2024-06-01 04:26:58,186 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 10:10:03, time: 0.634, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0415, loss_cls: 0.1855, acc: 93.1655, loss_bbox: 0.2379, loss_mask: 0.2419, loss: 0.7282 2024-06-01 04:27:30,373 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 10:09:29, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0469, loss_cls: 0.1968, acc: 92.9971, loss_bbox: 0.2480, loss_mask: 0.2496, loss: 0.7649 2024-06-01 04:28:02,581 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 10:08:55, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0438, loss_cls: 0.1965, acc: 92.7432, loss_bbox: 0.2510, loss_mask: 0.2459, loss: 0.7584 2024-06-01 04:28:34,940 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 10:08:22, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0461, loss_cls: 0.1967, acc: 92.8940, loss_bbox: 0.2493, loss_mask: 0.2440, loss: 0.7590 2024-06-01 04:29:07,152 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 10:07:48, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0469, loss_cls: 0.2002, acc: 92.7461, loss_bbox: 0.2552, loss_mask: 0.2509, loss: 0.7771 2024-06-01 04:29:39,369 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 10:07:15, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0424, loss_cls: 0.1938, acc: 93.0344, loss_bbox: 0.2449, loss_mask: 0.2464, loss: 0.7499 2024-06-01 04:30:11,391 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 10:06:41, time: 0.640, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0456, loss_cls: 0.1965, acc: 92.8855, loss_bbox: 0.2468, loss_mask: 0.2433, loss: 0.7542 2024-06-01 04:30:43,466 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
10:06:07, time: 0.642, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0456, loss_cls: 0.1952, acc: 92.9558, loss_bbox: 0.2462, loss_mask: 0.2512, loss: 0.7607 2024-06-01 04:31:15,369 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 10:05:33, time: 0.638, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0456, loss_cls: 0.1982, acc: 92.6963, loss_bbox: 0.2575, loss_mask: 0.2482, loss: 0.7721 2024-06-01 04:31:47,451 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 10:04:59, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0447, loss_cls: 0.1904, acc: 93.1323, loss_bbox: 0.2453, loss_mask: 0.2468, loss: 0.7498 2024-06-01 04:32:21,833 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 10:04:29, time: 0.688, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0447, loss_cls: 0.1913, acc: 92.9756, loss_bbox: 0.2450, loss_mask: 0.2439, loss: 0.7480 2024-06-01 04:32:56,345 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 10:03:59, time: 0.690, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0455, loss_cls: 0.1987, acc: 92.8342, loss_bbox: 0.2536, loss_mask: 0.2490, loss: 0.7691 2024-06-01 04:33:37,072 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 10:03:40, time: 0.814, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0467, loss_cls: 0.2036, acc: 92.7307, loss_bbox: 0.2487, loss_mask: 0.2461, loss: 0.7695 2024-06-01 04:34:09,407 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 10:03:06, time: 0.647, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0432, loss_cls: 0.1952, acc: 92.9451, loss_bbox: 0.2450, loss_mask: 0.2409, loss: 0.7447 2024-06-01 04:34:41,327 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 10:02:32, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0412, loss_cls: 0.1908, acc: 93.1157, loss_bbox: 0.2403, loss_mask: 0.2395, loss: 0.7328 2024-06-01 04:35:13,296 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 10:01:58, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0417, loss_cls: 0.1888, acc: 93.0652, loss_bbox: 0.2419, loss_mask: 0.2464, loss: 0.7400 2024-06-01 04:35:47,359 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 10:01:28, time: 0.681, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0441, loss_cls: 0.1952, acc: 92.9214, loss_bbox: 0.2465, loss_mask: 0.2485, loss: 0.7565 2024-06-01 04:36:19,457 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 10:00:54, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0443, loss_cls: 0.1992, acc: 92.7510, loss_bbox: 0.2498, loss_mask: 0.2484, loss: 0.7637 2024-06-01 04:36:53,874 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 10:00:24, time: 0.688, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0440, loss_cls: 0.1901, acc: 93.1211, loss_bbox: 0.2376, loss_mask: 0.2451, loss: 0.7382 2024-06-01 04:37:26,112 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 9:59:50, time: 0.645, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0482, loss_cls: 0.2043, acc: 92.5623, loss_bbox: 0.2602, loss_mask: 0.2513, loss: 0.7873 2024-06-01 04:37:58,075 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 9:59:16, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0240, loss_rpn_bbox: 
0.0434, loss_cls: 0.1914, acc: 93.0662, loss_bbox: 0.2473, loss_mask: 0.2450, loss: 0.7512 2024-06-01 04:38:30,512 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 9:58:43, time: 0.649, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0467, loss_cls: 0.1990, acc: 92.6003, loss_bbox: 0.2538, loss_mask: 0.2542, loss: 0.7755 2024-06-01 04:39:02,495 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 9:58:09, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0474, loss_cls: 0.1932, acc: 92.9844, loss_bbox: 0.2434, loss_mask: 0.2500, loss: 0.7579 2024-06-01 04:39:34,888 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 9:57:36, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0460, loss_cls: 0.2027, acc: 92.6257, loss_bbox: 0.2580, loss_mask: 0.2545, loss: 0.7838 2024-06-01 04:40:06,975 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 9:57:02, time: 0.642, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0432, loss_cls: 0.1966, acc: 92.9353, loss_bbox: 0.2547, loss_mask: 0.2482, loss: 0.7660 2024-06-01 04:40:39,173 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 9:56:28, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0485, loss_cls: 0.2017, acc: 92.7515, loss_bbox: 0.2544, loss_mask: 0.2504, loss: 0.7805 2024-06-01 04:41:11,362 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 9:55:54, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0404, loss_cls: 0.1808, acc: 93.5117, loss_bbox: 0.2275, loss_mask: 0.2323, loss: 0.7002 2024-06-01 04:41:43,593 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 9:55:21, time: 0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0485, loss_cls: 0.2050, acc: 92.5544, loss_bbox: 0.2588, loss_mask: 0.2485, loss: 0.7832 2024-06-01 04:42:16,146 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 9:54:48, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0427, loss_cls: 0.1890, acc: 93.0962, loss_bbox: 0.2408, loss_mask: 0.2393, loss: 0.7340 2024-06-01 04:42:48,152 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 9:54:14, time: 0.640, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0422, loss_cls: 0.1954, acc: 92.9583, loss_bbox: 0.2435, loss_mask: 0.2423, loss: 0.7439 2024-06-01 04:43:19,950 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 9:53:39, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0426, loss_cls: 0.2006, acc: 92.8337, loss_bbox: 0.2524, loss_mask: 0.2507, loss: 0.7687 2024-06-01 04:43:56,513 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 9:53:13, time: 0.731, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0420, loss_cls: 0.1953, acc: 92.8523, loss_bbox: 0.2463, loss_mask: 0.2440, loss: 0.7490 2024-06-01 04:44:30,552 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 9:52:42, time: 0.681, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0408, loss_cls: 0.1874, acc: 93.1799, loss_bbox: 0.2363, loss_mask: 0.2400, loss: 0.7253 2024-06-01 04:45:10,119 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 9:52:20, time: 0.791, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0442, loss_cls: 0.1925, acc: 93.0183, loss_bbox: 0.2457, loss_mask: 0.2468, loss: 0.7507 2024-06-01 
04:45:42,414 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 9:51:47, time: 0.646, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0452, loss_cls: 0.1887, acc: 92.9802, loss_bbox: 0.2454, loss_mask: 0.2515, loss: 0.7543 2024-06-01 04:46:14,621 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 9:51:13, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0419, loss_cls: 0.1896, acc: 93.0471, loss_bbox: 0.2352, loss_mask: 0.2399, loss: 0.7262 2024-06-01 04:46:46,581 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 9:50:39, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0443, loss_cls: 0.2002, acc: 92.7292, loss_bbox: 0.2514, loss_mask: 0.2433, loss: 0.7619 2024-06-01 04:47:20,774 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 9:50:09, time: 0.684, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0443, loss_cls: 0.1956, acc: 92.9695, loss_bbox: 0.2465, loss_mask: 0.2479, loss: 0.7555 2024-06-01 04:47:52,661 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 9:49:35, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0433, loss_cls: 0.1896, acc: 93.0698, loss_bbox: 0.2436, loss_mask: 0.2480, loss: 0.7459 2024-06-01 04:48:26,917 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 9:49:04, time: 0.685, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0433, loss_cls: 0.1851, acc: 93.2419, loss_bbox: 0.2332, loss_mask: 0.2337, loss: 0.7166 2024-06-01 04:48:58,880 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 9:48:30, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0443, loss_cls: 0.1912, acc: 93.0967, loss_bbox: 0.2447, loss_mask: 0.2491, loss: 0.7510 2024-06-01 04:49:30,900 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 9:47:56, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0465, loss_cls: 0.2044, acc: 92.6714, loss_bbox: 0.2587, loss_mask: 0.2528, loss: 0.7860 2024-06-01 04:50:02,937 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 9:47:22, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0410, loss_cls: 0.1842, acc: 93.3774, loss_bbox: 0.2358, loss_mask: 0.2349, loss: 0.7174 2024-06-01 04:50:34,566 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 9:46:48, time: 0.633, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0418, loss_cls: 0.1852, acc: 93.3313, loss_bbox: 0.2314, loss_mask: 0.2389, loss: 0.7206 2024-06-01 04:51:06,607 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 9:46:14, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0449, loss_cls: 0.1979, acc: 92.7285, loss_bbox: 0.2519, loss_mask: 0.2509, loss: 0.7694 2024-06-01 04:51:38,318 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 9:45:40, time: 0.634, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0416, loss_cls: 0.1900, acc: 93.1738, loss_bbox: 0.2392, loss_mask: 0.2409, loss: 0.7325 2024-06-01 04:52:10,495 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 9:45:06, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0426, loss_cls: 0.1883, acc: 93.1079, loss_bbox: 0.2452, loss_mask: 0.2475, loss: 0.7467 2024-06-01 04:52:43,127 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 9:44:33, time: 0.653, data_time: 
0.036, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0517, loss_cls: 0.2023, acc: 92.6553, loss_bbox: 0.2609, loss_mask: 0.2535, loss: 0.7933 2024-06-01 04:53:15,139 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 9:43:59, time: 0.640, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0442, loss_cls: 0.1977, acc: 92.8967, loss_bbox: 0.2531, loss_mask: 0.2509, loss: 0.7691 2024-06-01 04:53:47,396 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 9:43:25, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0459, loss_cls: 0.1946, acc: 92.9622, loss_bbox: 0.2443, loss_mask: 0.2393, loss: 0.7474 2024-06-01 04:54:19,509 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 9:42:52, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0446, loss_cls: 0.1967, acc: 92.7627, loss_bbox: 0.2506, loss_mask: 0.2455, loss: 0.7599 2024-06-01 04:54:51,695 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 9:42:18, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0445, loss_cls: 0.1915, acc: 93.1147, loss_bbox: 0.2394, loss_mask: 0.2480, loss: 0.7470 2024-06-01 04:55:28,254 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 9:41:51, time: 0.731, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0418, loss_cls: 0.1855, acc: 93.3799, loss_bbox: 0.2364, loss_mask: 0.2449, loss: 0.7300 2024-06-01 04:56:02,336 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 9:41:20, time: 0.682, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0437, loss_cls: 0.1972, acc: 92.9363, loss_bbox: 0.2448, loss_mask: 0.2484, loss: 0.7569 2024-06-01 04:56:39,223 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 9:40:54, time: 0.738, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0414, loss_cls: 0.1955, acc: 92.9019, loss_bbox: 0.2446, loss_mask: 0.2439, loss: 0.7459 2024-06-01 04:57:14,090 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 9:40:24, time: 0.697, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0468, loss_cls: 0.1931, acc: 93.0447, loss_bbox: 0.2462, loss_mask: 0.2478, loss: 0.7563 2024-06-01 04:57:46,429 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 9:39:51, time: 0.647, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0428, loss_cls: 0.1878, acc: 93.0977, loss_bbox: 0.2387, loss_mask: 0.2416, loss: 0.7334 2024-06-01 04:58:18,444 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 9:39:17, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0460, loss_cls: 0.1996, acc: 92.7161, loss_bbox: 0.2507, loss_mask: 0.2510, loss: 0.7714 2024-06-01 04:58:52,985 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 9:38:47, time: 0.691, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0429, loss_cls: 0.1874, acc: 93.2341, loss_bbox: 0.2415, loss_mask: 0.2403, loss: 0.7334 2024-06-01 04:59:25,345 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 9:38:14, time: 0.647, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0445, loss_cls: 0.1866, acc: 93.2439, loss_bbox: 0.2409, loss_mask: 0.2505, loss: 0.7433 2024-06-01 04:59:59,263 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 9:37:43, time: 0.678, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0444, loss_cls: 0.1899, acc: 93.1677, loss_bbox: 
0.2434, loss_mask: 0.2462, loss: 0.7470 2024-06-01 05:00:31,155 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 9:37:09, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0428, loss_cls: 0.1912, acc: 92.9719, loss_bbox: 0.2496, loss_mask: 0.2467, loss: 0.7520 2024-06-01 05:01:03,269 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 9:36:35, time: 0.642, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0472, loss_cls: 0.2010, acc: 92.7056, loss_bbox: 0.2535, loss_mask: 0.2480, loss: 0.7725 2024-06-01 05:01:35,213 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 9:36:01, time: 0.639, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0434, loss_cls: 0.1970, acc: 92.8540, loss_bbox: 0.2444, loss_mask: 0.2477, loss: 0.7557 2024-06-01 05:02:07,772 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 9:35:28, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0473, loss_cls: 0.2034, acc: 92.5708, loss_bbox: 0.2621, loss_mask: 0.2490, loss: 0.7850 2024-06-01 05:02:40,412 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 9:34:55, time: 0.653, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0469, loss_cls: 0.1994, acc: 92.7729, loss_bbox: 0.2547, loss_mask: 0.2469, loss: 0.7718 2024-06-01 05:03:12,831 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 9:34:22, time: 0.648, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0467, loss_cls: 0.2087, acc: 92.4648, loss_bbox: 0.2615, loss_mask: 0.2512, loss: 0.7922 2024-06-01 05:03:44,629 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 9:33:47, time: 0.636, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0409, loss_cls: 0.1897, acc: 93.3230, loss_bbox: 0.2357, loss_mask: 0.2426, loss: 0.7308 2024-06-01 05:04:17,141 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 9:33:14, time: 0.650, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0429, loss_cls: 0.1882, acc: 93.1851, loss_bbox: 0.2381, loss_mask: 0.2434, loss: 0.7355 2024-06-01 05:04:48,998 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 9:32:40, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0431, loss_cls: 0.1941, acc: 92.8296, loss_bbox: 0.2488, loss_mask: 0.2450, loss: 0.7534 2024-06-01 05:05:21,199 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 9:32:06, time: 0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0442, loss_cls: 0.1992, acc: 92.7178, loss_bbox: 0.2522, loss_mask: 0.2473, loss: 0.7642 2024-06-01 05:05:53,459 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 9:31:33, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0443, loss_cls: 0.1936, acc: 92.9465, loss_bbox: 0.2460, loss_mask: 0.2452, loss: 0.7528 2024-06-01 05:06:25,605 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 9:30:59, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0450, loss_cls: 0.1954, acc: 92.9600, loss_bbox: 0.2449, loss_mask: 0.2431, loss: 0.7502 2024-06-01 05:07:00,226 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 9:30:29, time: 0.692, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0434, loss_cls: 0.1960, acc: 92.7786, loss_bbox: 0.2455, loss_mask: 0.2443, loss: 0.7532 2024-06-01 05:07:37,310 - mmdet - INFO - Epoch [5][6450/7330] lr: 
1.000e-04, eta: 9:30:03, time: 0.742, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0450, loss_cls: 0.1884, acc: 93.0955, loss_bbox: 0.2462, loss_mask: 0.2511, loss: 0.7525 2024-06-01 05:08:12,836 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 9:29:34, time: 0.711, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0462, loss_cls: 0.2113, acc: 92.3967, loss_bbox: 0.2611, loss_mask: 0.2547, loss: 0.7977 2024-06-01 05:08:49,398 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 9:29:07, time: 0.731, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0420, loss_cls: 0.1856, acc: 93.1953, loss_bbox: 0.2355, loss_mask: 0.2413, loss: 0.7260 2024-06-01 05:09:22,051 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 9:28:34, time: 0.653, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0449, loss_cls: 0.1953, acc: 92.8140, loss_bbox: 0.2477, loss_mask: 0.2466, loss: 0.7576 2024-06-01 05:09:54,021 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 9:28:00, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0453, loss_cls: 0.1926, acc: 92.9873, loss_bbox: 0.2420, loss_mask: 0.2462, loss: 0.7484 2024-06-01 05:10:25,968 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 9:27:26, time: 0.639, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0427, loss_cls: 0.1888, acc: 93.1218, loss_bbox: 0.2396, loss_mask: 0.2391, loss: 0.7310 2024-06-01 05:11:00,371 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 9:26:56, time: 0.688, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0454, loss_cls: 0.1957, acc: 92.9524, loss_bbox: 0.2410, loss_mask: 0.2391, loss: 0.7439 2024-06-01 05:11:32,524 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 9:26:22, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0458, loss_cls: 0.2051, acc: 92.6526, loss_bbox: 0.2509, loss_mask: 0.2419, loss: 0.7671 2024-06-01 05:12:06,925 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 9:25:51, time: 0.688, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0482, loss_cls: 0.2031, acc: 92.6816, loss_bbox: 0.2597, loss_mask: 0.2523, loss: 0.7870 2024-06-01 05:12:38,897 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 9:25:17, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0409, loss_cls: 0.1915, acc: 93.2822, loss_bbox: 0.2411, loss_mask: 0.2454, loss: 0.7401 2024-06-01 05:13:11,430 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 9:24:44, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0451, loss_cls: 0.1907, acc: 92.9583, loss_bbox: 0.2417, loss_mask: 0.2383, loss: 0.7374 2024-06-01 05:13:43,636 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 9:24:11, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0452, loss_cls: 0.1921, acc: 92.9380, loss_bbox: 0.2442, loss_mask: 0.2504, loss: 0.7546 2024-06-01 05:14:15,924 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 9:23:37, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0453, loss_cls: 0.1914, acc: 93.0857, loss_bbox: 0.2414, loss_mask: 0.2430, loss: 0.7438 2024-06-01 05:14:48,207 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 9:23:04, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 
0.0449, loss_cls: 0.1950, acc: 92.8762, loss_bbox: 0.2515, loss_mask: 0.2546, loss: 0.7681
2024-06-01 05:15:20,182 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 9:22:30, time: 0.639, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0449, loss_cls: 0.1948, acc: 92.9912, loss_bbox: 0.2459, loss_mask: 0.2450, loss: 0.7529
2024-06-01 05:15:51,748 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 9:21:55, time: 0.632, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0431, loss_cls: 0.1937, acc: 92.9487, loss_bbox: 0.2459, loss_mask: 0.2513, loss: 0.7555
2024-06-01 05:16:23,951 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 9:21:22, time: 0.644, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0431, loss_cls: 0.1914, acc: 92.9480, loss_bbox: 0.2466, loss_mask: 0.2411, loss: 0.7441
2024-06-01 05:16:56,367 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 9:20:49, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0459, loss_cls: 0.2024, acc: 92.6311, loss_bbox: 0.2541, loss_mask: 0.2507, loss: 0.7761
2024-06-01 05:17:16,461 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-06-01 05:19:02,744 - mmdet - INFO - Evaluating bbox...
2024-06-01 05:19:26,926 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.438
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.681
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.479
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.265
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.483
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.601
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.560
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.560
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.560
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.359
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.612
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.730
2024-06-01 05:19:26,926 - mmdet - INFO - Evaluating segm...
2024-06-01 05:19:54,094 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.395
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.643
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.419
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.179
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.432
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.617
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.505
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.505
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.505
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.287
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.561
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.701
2024-06-01 05:19:54,516 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 05:19:54,518 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4380, bbox_mAP_50: 0.6810, bbox_mAP_75: 0.4790, bbox_mAP_s: 0.2650, bbox_mAP_m: 0.4830, bbox_mAP_l: 0.6010, bbox_mAP_copypaste: 0.438 0.681 0.479 0.265 0.483 0.601, segm_mAP: 0.3950, segm_mAP_50: 0.6430, segm_mAP_75: 0.4190, segm_mAP_s: 0.1790, segm_mAP_m: 0.4320, segm_mAP_l: 0.6170, segm_mAP_copypaste: 0.395 0.643 0.419 0.179 0.432 0.617
2024-06-01 05:20:30,645 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 9:19:33, time: 0.722, data_time: 0.097, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0407, loss_cls: 0.1779, acc: 93.4104, loss_bbox: 0.2393, loss_mask: 0.2348, loss: 0.7113
2024-06-01 05:21:05,798 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 9:19:04, time: 0.703, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0423, loss_cls: 0.1874, acc: 93.1565, loss_bbox: 0.2441, loss_mask: 0.2409, loss: 0.7354
2024-06-01 05:21:40,068 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 9:18:33, time: 0.685, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0459, loss_cls: 0.1921, acc: 92.8723, loss_bbox: 0.2488, loss_mask: 0.2451, loss: 0.7529
2024-06-01 05:22:18,685 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 9:18:09, time: 0.772, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0432, loss_cls: 0.1895, acc: 92.9880, loss_bbox: 0.2454, loss_mask: 0.2442, loss: 0.7403
2024-06-01 05:22:51,072 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 9:17:35, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0459, loss_cls: 0.1860, acc: 93.1370, loss_bbox: 0.2420, loss_mask: 0.2442, loss: 0.7379
2024-06-01 05:23:23,262 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 9:17:02, time: 0.644, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0454, loss_cls: 0.1923, acc: 92.8521, loss_bbox: 0.2472, loss_mask: 0.2435, loss: 0.7508
2024-06-01 05:23:55,082 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 05:23:55,082 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 9:16:28, time: 0.636, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0438, loss_cls: 0.1833, acc: 93.2485, loss_bbox: 0.2406, loss_mask: 0.2498, loss: 0.7393
2024-06-01 05:24:27,361 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 9:15:54, time: 0.646, data_time: 0.033, memory: 13277,
loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0414, loss_cls: 0.1821, acc: 93.4092, loss_bbox: 0.2343, loss_mask: 0.2414, loss: 0.7198 2024-06-01 05:25:02,281 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 9:15:25, time: 0.698, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0430, loss_cls: 0.1892, acc: 92.9756, loss_bbox: 0.2448, loss_mask: 0.2421, loss: 0.7399 2024-06-01 05:25:34,536 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 9:14:51, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0448, loss_cls: 0.1946, acc: 92.8892, loss_bbox: 0.2476, loss_mask: 0.2396, loss: 0.7490 2024-06-01 05:26:06,594 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 9:14:17, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0422, loss_cls: 0.1826, acc: 93.3086, loss_bbox: 0.2323, loss_mask: 0.2358, loss: 0.7132 2024-06-01 05:26:38,795 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 9:13:44, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0453, loss_cls: 0.1911, acc: 92.9248, loss_bbox: 0.2461, loss_mask: 0.2469, loss: 0.7508 2024-06-01 05:27:10,601 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 9:13:10, time: 0.636, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0414, loss_cls: 0.1840, acc: 93.3083, loss_bbox: 0.2435, loss_mask: 0.2338, loss: 0.7209 2024-06-01 05:27:42,452 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 9:12:36, time: 0.637, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0454, loss_cls: 0.1892, acc: 93.0947, loss_bbox: 0.2428, loss_mask: 0.2427, loss: 0.7399 2024-06-01 05:28:14,468 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 9:12:02, time: 0.640, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0385, loss_cls: 0.1690, acc: 93.8760, loss_bbox: 0.2243, loss_mask: 0.2388, loss: 0.6891 2024-06-01 05:28:46,861 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 9:11:29, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0432, loss_cls: 0.1820, acc: 93.3782, loss_bbox: 0.2369, loss_mask: 0.2392, loss: 0.7214 2024-06-01 05:29:19,110 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 9:10:55, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0410, loss_cls: 0.1787, acc: 93.3977, loss_bbox: 0.2308, loss_mask: 0.2393, loss: 0.7094 2024-06-01 05:29:50,983 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 9:10:21, time: 0.637, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0408, loss_cls: 0.1722, acc: 93.6453, loss_bbox: 0.2268, loss_mask: 0.2385, loss: 0.6970 2024-06-01 05:30:23,178 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 9:09:48, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0422, loss_cls: 0.1776, acc: 93.4609, loss_bbox: 0.2292, loss_mask: 0.2358, loss: 0.7048 2024-06-01 05:30:55,472 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 9:09:15, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0419, loss_cls: 0.1837, acc: 93.3311, loss_bbox: 0.2400, loss_mask: 0.2382, loss: 0.7238 2024-06-01 05:31:27,484 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 9:08:41, time: 0.640, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0416, loss_cls: 0.1868, acc: 93.1990, loss_bbox: 0.2397, loss_mask: 0.2399, loss: 
0.7291 2024-06-01 05:31:59,653 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 9:08:07, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0448, loss_cls: 0.1892, acc: 93.0627, loss_bbox: 0.2462, loss_mask: 0.2468, loss: 0.7474 2024-06-01 05:32:33,616 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 9:07:36, time: 0.679, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0438, loss_cls: 0.1833, acc: 93.3313, loss_bbox: 0.2337, loss_mask: 0.2449, loss: 0.7260 2024-06-01 05:33:11,452 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 9:07:10, time: 0.757, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0433, loss_cls: 0.1884, acc: 93.0991, loss_bbox: 0.2427, loss_mask: 0.2439, loss: 0.7392 2024-06-01 05:33:50,495 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 9:06:46, time: 0.781, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0417, loss_cls: 0.1861, acc: 93.1484, loss_bbox: 0.2385, loss_mask: 0.2398, loss: 0.7252 2024-06-01 05:34:24,965 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 9:06:15, time: 0.689, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0469, loss_cls: 0.1927, acc: 92.9175, loss_bbox: 0.2487, loss_mask: 0.2480, loss: 0.7574 2024-06-01 05:34:57,012 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 05:34:57,012 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 9:05:41, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0432, loss_cls: 0.1915, acc: 92.9268, loss_bbox: 0.2477, loss_mask: 0.2457, loss: 0.7492 2024-06-01 05:35:29,154 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 9:05:08, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0455, loss_cls: 0.1908, acc: 93.0645, loss_bbox: 0.2421, loss_mask: 0.2427, loss: 0.7416 2024-06-01 05:36:01,334 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 9:04:34, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0447, loss_cls: 0.1906, acc: 92.9600, loss_bbox: 0.2476, loss_mask: 0.2396, loss: 0.7438 2024-06-01 05:36:33,684 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 9:04:01, time: 0.647, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0440, loss_cls: 0.1873, acc: 93.1189, loss_bbox: 0.2418, loss_mask: 0.2391, loss: 0.7340 2024-06-01 05:37:09,675 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 9:03:32, time: 0.720, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0415, loss_cls: 0.1857, acc: 93.2625, loss_bbox: 0.2372, loss_mask: 0.2402, loss: 0.7237 2024-06-01 05:37:41,871 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 9:02:59, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0420, loss_cls: 0.1865, acc: 93.1646, loss_bbox: 0.2418, loss_mask: 0.2414, loss: 0.7317 2024-06-01 05:38:14,522 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 9:02:26, time: 0.653, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0407, loss_cls: 0.1873, acc: 93.0955, loss_bbox: 0.2441, loss_mask: 0.2406, loss: 0.7320 2024-06-01 05:38:46,887 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 9:01:53, time: 0.647, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0462, loss_cls: 0.1861, acc: 93.2336, loss_bbox: 0.2371, loss_mask: 0.2458, loss: 0.7386 
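Every training interval in this log is written on one line in the same form, "Epoch [epoch][iter/7330] lr: ..., loss_cls: ..., loss: ...", so the loss curve can be recovered from a saved copy of the log with a small parser. The sketch below is illustrative only: the file name train.log and the choice of extracted fields are assumptions, not part of the original run.

import re

# Matches the per-interval training lines shown in this log, e.g.
# "Epoch [6][1700/7330] lr: 1.000e-04, ... loss_cls: 0.1861, ... loss: 0.7386"
ITER_RE = re.compile(
    r"Epoch \[(?P<epoch>\d+)\]\[(?P<iter>\d+)/\d+\].*?"
    r"loss_cls: (?P<loss_cls>[\d.]+).*?loss: (?P<loss>[\d.]+)"
)

def parse_train_intervals(path="train.log"):  # file name is an assumption
    """Return one dict per logged training interval."""
    records = []
    with open(path) as f:
        for line in f:
            m = ITER_RE.search(line)
            if m:
                records.append({
                    "epoch": int(m.group("epoch")),
                    "iter": int(m.group("iter")),
                    "loss_cls": float(m.group("loss_cls")),
                    "loss": float(m.group("loss")),
                })
    return records

if __name__ == "__main__":
    recs = parse_train_intervals()
    # Example use: mean total loss per epoch
    for ep in sorted({r["epoch"] for r in recs}):
        losses = [r["loss"] for r in recs if r["epoch"] == ep]
        print(f"epoch {ep}: mean loss {sum(losses) / len(losses):.4f}")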
2024-06-01 05:39:18,856 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 9:01:19, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0410, loss_cls: 0.1851, acc: 93.2354, loss_bbox: 0.2402, loss_mask: 0.2374, loss: 0.7235 2024-06-01 05:39:50,963 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 9:00:45, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0410, loss_cls: 0.1847, acc: 93.1597, loss_bbox: 0.2341, loss_mask: 0.2430, loss: 0.7219 2024-06-01 05:40:23,228 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 9:00:12, time: 0.645, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0437, loss_cls: 0.1875, acc: 93.0601, loss_bbox: 0.2454, loss_mask: 0.2445, loss: 0.7421 2024-06-01 05:40:55,415 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 8:59:38, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0413, loss_cls: 0.1842, acc: 93.1953, loss_bbox: 0.2412, loss_mask: 0.2389, loss: 0.7247 2024-06-01 05:41:27,568 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 8:59:05, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0427, loss_cls: 0.1838, acc: 93.2542, loss_bbox: 0.2383, loss_mask: 0.2389, loss: 0.7247 2024-06-01 05:41:59,644 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 8:58:31, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0426, loss_cls: 0.1807, acc: 93.3054, loss_bbox: 0.2356, loss_mask: 0.2364, loss: 0.7152 2024-06-01 05:42:31,797 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 8:57:58, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0431, loss_cls: 0.1825, acc: 93.2610, loss_bbox: 0.2392, loss_mask: 0.2403, loss: 0.7259 2024-06-01 05:43:03,576 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 8:57:24, time: 0.635, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0409, loss_cls: 0.1801, acc: 93.4180, loss_bbox: 0.2306, loss_mask: 0.2340, loss: 0.7053 2024-06-01 05:43:35,759 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 8:56:50, time: 0.644, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0435, loss_cls: 0.1825, acc: 93.3777, loss_bbox: 0.2339, loss_mask: 0.2399, loss: 0.7208 2024-06-01 05:44:10,041 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 8:56:19, time: 0.686, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0390, loss_cls: 0.1843, acc: 93.2898, loss_bbox: 0.2358, loss_mask: 0.2392, loss: 0.7170 2024-06-01 05:44:48,200 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 8:55:53, time: 0.763, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0413, loss_cls: 0.1812, acc: 93.3247, loss_bbox: 0.2394, loss_mask: 0.2406, loss: 0.7238 2024-06-01 05:45:26,797 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 8:55:28, time: 0.772, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0398, loss_cls: 0.1838, acc: 93.3186, loss_bbox: 0.2313, loss_mask: 0.2367, loss: 0.7113 2024-06-01 05:46:01,334 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 05:46:01,334 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 8:54:57, time: 0.691, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0467, loss_cls: 0.1912, acc: 92.9683, loss_bbox: 0.2428, loss_mask: 0.2375, loss: 0.7407 
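Each validation pass is condensed into a single "Epoch(val) [epoch][625] bbox_mAP: ..., segm_mAP: ..." line, like the epoch-5 summary earlier in this log (bbox_mAP 0.438, segm_mAP 0.395). A minimal sketch for collecting those per-epoch summaries follows, again assuming a saved train.log; dropping the *_copypaste fields is an illustrative choice, since they are space-separated blocks rather than single scalars.

import re

# Matches the per-epoch validation summary lines, e.g.
# "Epoch(val) [5][625] bbox_mAP: 0.4380, ..., segm_mAP: 0.3950, ..."
VAL_RE = re.compile(r"Epoch\(val\) \[(\d+)\]\[\d+\]\s+(.*)")
METRIC_RE = re.compile(r"(\w+): ([\d.]+)")

def parse_val_summaries(path="train.log"):  # file name is an assumption
    """Map epoch number -> {metric: value} from every Epoch(val) line."""
    results = {}
    with open(path) as f:
        for line in f:
            m = VAL_RE.search(line)
            if not m:
                continue
            results[int(m.group(1))] = {
                name: float(value)
                for name, value in METRIC_RE.findall(m.group(2))
                if not name.endswith("copypaste")  # skip the copy-paste blocks
            }
    return results

if __name__ == "__main__":
    for epoch, metrics in sorted(parse_val_summaries().items()):
        print(epoch, metrics.get("bbox_mAP"), metrics.get("segm_mAP"))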
2024-06-01 05:46:33,259 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 8:54:23, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0426, loss_cls: 0.1835, acc: 93.3013, loss_bbox: 0.2365, loss_mask: 0.2380, loss: 0.7215 2024-06-01 05:47:05,473 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 8:53:50, time: 0.644, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0430, loss_cls: 0.1895, acc: 92.9468, loss_bbox: 0.2485, loss_mask: 0.2385, loss: 0.7400 2024-06-01 05:47:37,740 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 8:53:17, time: 0.645, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0450, loss_cls: 0.1948, acc: 92.8113, loss_bbox: 0.2548, loss_mask: 0.2421, loss: 0.7580 2024-06-01 05:48:09,869 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 8:52:43, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0428, loss_cls: 0.1934, acc: 92.9316, loss_bbox: 0.2456, loss_mask: 0.2427, loss: 0.7457 2024-06-01 05:48:44,383 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 8:52:12, time: 0.690, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0406, loss_cls: 0.1843, acc: 93.3738, loss_bbox: 0.2389, loss_mask: 0.2387, loss: 0.7209 2024-06-01 05:49:16,612 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 8:51:39, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0423, loss_cls: 0.1862, acc: 93.1191, loss_bbox: 0.2392, loss_mask: 0.2457, loss: 0.7325 2024-06-01 05:49:48,993 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 8:51:06, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0437, loss_cls: 0.1895, acc: 93.1331, loss_bbox: 0.2420, loss_mask: 0.2431, loss: 0.7389 2024-06-01 05:50:21,225 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 8:50:32, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0450, loss_cls: 0.1884, acc: 93.0256, loss_bbox: 0.2464, loss_mask: 0.2433, loss: 0.7449 2024-06-01 05:50:53,361 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 8:49:59, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0435, loss_cls: 0.1899, acc: 93.0068, loss_bbox: 0.2474, loss_mask: 0.2411, loss: 0.7422 2024-06-01 05:51:25,731 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 8:49:25, time: 0.647, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0419, loss_cls: 0.1949, acc: 92.9905, loss_bbox: 0.2492, loss_mask: 0.2415, loss: 0.7471 2024-06-01 05:51:57,896 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 8:48:52, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0447, loss_cls: 0.1919, acc: 93.0146, loss_bbox: 0.2486, loss_mask: 0.2452, loss: 0.7522 2024-06-01 05:52:29,872 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 8:48:18, time: 0.640, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0411, loss_cls: 0.1884, acc: 93.0835, loss_bbox: 0.2453, loss_mask: 0.2410, loss: 0.7360 2024-06-01 05:53:02,055 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 8:47:45, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0421, loss_cls: 0.1823, acc: 93.3511, loss_bbox: 0.2347, loss_mask: 0.2360, loss: 0.7165 2024-06-01 05:53:33,952 - mmdet - INFO - Epoch [6][3050/7330] lr: 1.000e-04, eta: 8:47:11, time: 0.638, 
data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0410, loss_cls: 0.1757, acc: 93.5398, loss_bbox: 0.2287, loss_mask: 0.2391, loss: 0.7037 2024-06-01 05:54:06,041 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 8:46:37, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0419, loss_cls: 0.1901, acc: 93.0366, loss_bbox: 0.2395, loss_mask: 0.2408, loss: 0.7332 2024-06-01 05:54:38,044 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 8:46:03, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0420, loss_cls: 0.1862, acc: 93.2224, loss_bbox: 0.2380, loss_mask: 0.2391, loss: 0.7269 2024-06-01 05:55:10,077 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 8:45:30, time: 0.641, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0434, loss_cls: 0.1850, acc: 93.1594, loss_bbox: 0.2388, loss_mask: 0.2397, loss: 0.7297 2024-06-01 05:55:44,119 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 8:44:59, time: 0.681, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0399, loss_cls: 0.1736, acc: 93.6213, loss_bbox: 0.2292, loss_mask: 0.2366, loss: 0.6994 2024-06-01 05:56:16,141 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 8:44:25, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0408, loss_cls: 0.1816, acc: 93.3530, loss_bbox: 0.2374, loss_mask: 0.2386, loss: 0.7177 2024-06-01 05:56:58,545 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 05:56:58,545 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 8:44:04, time: 0.848, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0454, loss_cls: 0.1852, acc: 93.2078, loss_bbox: 0.2393, loss_mask: 0.2467, loss: 0.7382 2024-06-01 05:57:35,560 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 8:43:36, time: 0.740, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0442, loss_cls: 0.1900, acc: 93.0493, loss_bbox: 0.2439, loss_mask: 0.2461, loss: 0.7451 2024-06-01 05:58:08,212 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 8:43:03, time: 0.653, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0458, loss_cls: 0.1894, acc: 92.9470, loss_bbox: 0.2473, loss_mask: 0.2494, loss: 0.7518 2024-06-01 05:58:40,416 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 8:42:30, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0447, loss_cls: 0.1937, acc: 92.9536, loss_bbox: 0.2466, loss_mask: 0.2421, loss: 0.7480 2024-06-01 05:59:12,585 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 8:41:56, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0430, loss_cls: 0.1877, acc: 93.0303, loss_bbox: 0.2425, loss_mask: 0.2386, loss: 0.7329 2024-06-01 05:59:45,029 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 8:41:23, time: 0.649, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0443, loss_cls: 0.1973, acc: 92.6409, loss_bbox: 0.2558, loss_mask: 0.2425, loss: 0.7613 2024-06-01 06:00:19,366 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 8:40:52, time: 0.687, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0438, loss_cls: 0.1880, acc: 93.1758, loss_bbox: 0.2415, loss_mask: 0.2347, loss: 0.7276 2024-06-01 06:00:51,165 - mmdet - INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 8:40:18, time: 0.636, 
data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0397, loss_cls: 0.1795, acc: 93.4590, loss_bbox: 0.2301, loss_mask: 0.2358, loss: 0.7051 2024-06-01 06:01:23,353 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 8:39:44, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0416, loss_cls: 0.1800, acc: 93.3870, loss_bbox: 0.2314, loss_mask: 0.2427, loss: 0.7165 2024-06-01 06:01:55,502 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 8:39:11, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0427, loss_cls: 0.1867, acc: 93.1624, loss_bbox: 0.2415, loss_mask: 0.2398, loss: 0.7316 2024-06-01 06:02:27,663 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 8:38:37, time: 0.643, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0439, loss_cls: 0.1864, acc: 93.2361, loss_bbox: 0.2349, loss_mask: 0.2358, loss: 0.7244 2024-06-01 06:02:59,801 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 8:38:04, time: 0.643, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0439, loss_cls: 0.1847, acc: 93.2583, loss_bbox: 0.2374, loss_mask: 0.2426, loss: 0.7304 2024-06-01 06:03:31,680 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 8:37:30, time: 0.638, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0423, loss_cls: 0.1768, acc: 93.5005, loss_bbox: 0.2307, loss_mask: 0.2432, loss: 0.7123 2024-06-01 06:04:04,132 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 8:36:57, time: 0.649, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0448, loss_cls: 0.1884, acc: 93.0708, loss_bbox: 0.2404, loss_mask: 0.2372, loss: 0.7329 2024-06-01 06:04:36,211 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 8:36:23, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0453, loss_cls: 0.1962, acc: 92.9180, loss_bbox: 0.2471, loss_mask: 0.2452, loss: 0.7547 2024-06-01 06:05:08,512 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 8:35:50, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0428, loss_cls: 0.1790, acc: 93.4399, loss_bbox: 0.2269, loss_mask: 0.2348, loss: 0.7033 2024-06-01 06:05:40,548 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 8:35:16, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0419, loss_cls: 0.1793, acc: 93.2991, loss_bbox: 0.2358, loss_mask: 0.2343, loss: 0.7112 2024-06-01 06:06:12,388 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 8:34:42, time: 0.637, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0434, loss_cls: 0.1823, acc: 93.1523, loss_bbox: 0.2368, loss_mask: 0.2374, loss: 0.7220 2024-06-01 06:06:44,208 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 8:34:09, time: 0.636, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0434, loss_cls: 0.1854, acc: 93.1482, loss_bbox: 0.2404, loss_mask: 0.2485, loss: 0.7386 2024-06-01 06:07:16,417 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 8:33:35, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0430, loss_cls: 0.1938, acc: 92.9102, loss_bbox: 0.2446, loss_mask: 0.2434, loss: 0.7462 2024-06-01 06:07:50,358 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 06:07:50,358 - mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 8:33:04, time: 0.679, 
data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0441, loss_cls: 0.1883, acc: 93.0303, loss_bbox: 0.2424, loss_mask: 0.2459, loss: 0.7422 2024-06-01 06:08:27,351 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 8:32:36, time: 0.740, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0429, loss_cls: 0.1828, acc: 93.2881, loss_bbox: 0.2358, loss_mask: 0.2377, loss: 0.7202 2024-06-01 06:09:06,305 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 8:32:10, time: 0.779, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0446, loss_cls: 0.1965, acc: 92.8977, loss_bbox: 0.2528, loss_mask: 0.2521, loss: 0.7683 2024-06-01 06:09:41,369 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 8:31:40, time: 0.701, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0440, loss_cls: 0.1901, acc: 93.0825, loss_bbox: 0.2413, loss_mask: 0.2406, loss: 0.7368 2024-06-01 06:10:13,267 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 8:31:06, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0389, loss_cls: 0.1725, acc: 93.7261, loss_bbox: 0.2176, loss_mask: 0.2324, loss: 0.6792 2024-06-01 06:10:45,764 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 8:30:33, time: 0.650, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0435, loss_cls: 0.1822, acc: 93.2166, loss_bbox: 0.2359, loss_mask: 0.2390, loss: 0.7211 2024-06-01 06:11:17,775 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 8:29:59, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0422, loss_cls: 0.1893, acc: 93.1882, loss_bbox: 0.2359, loss_mask: 0.2436, loss: 0.7310 2024-06-01 06:11:49,970 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 8:29:26, time: 0.644, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0435, loss_cls: 0.1882, acc: 93.1877, loss_bbox: 0.2337, loss_mask: 0.2370, loss: 0.7246 2024-06-01 06:12:24,834 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 8:28:55, time: 0.697, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0462, loss_cls: 0.1918, acc: 93.0593, loss_bbox: 0.2408, loss_mask: 0.2376, loss: 0.7413 2024-06-01 06:12:57,374 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 8:28:22, time: 0.651, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0441, loss_cls: 0.1938, acc: 92.9431, loss_bbox: 0.2454, loss_mask: 0.2435, loss: 0.7493 2024-06-01 06:13:29,846 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 8:27:49, time: 0.649, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0429, loss_cls: 0.1903, acc: 92.9929, loss_bbox: 0.2417, loss_mask: 0.2403, loss: 0.7363 2024-06-01 06:14:01,869 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 8:27:15, time: 0.640, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0417, loss_cls: 0.1817, acc: 93.3242, loss_bbox: 0.2342, loss_mask: 0.2430, loss: 0.7195 2024-06-01 06:14:33,998 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 8:26:42, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0430, loss_cls: 0.1889, acc: 93.0681, loss_bbox: 0.2449, loss_mask: 0.2478, loss: 0.7443 2024-06-01 06:15:06,379 - mmdet - INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 8:26:09, time: 0.648, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0466, loss_cls: 0.1946, acc: 92.9236, 
loss_bbox: 0.2470, loss_mask: 0.2422, loss: 0.7533 2024-06-01 06:15:38,452 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 8:25:35, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0431, loss_cls: 0.1862, acc: 93.1565, loss_bbox: 0.2418, loss_mask: 0.2427, loss: 0.7334 2024-06-01 06:16:10,624 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 8:25:02, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0429, loss_cls: 0.1872, acc: 93.2402, loss_bbox: 0.2358, loss_mask: 0.2452, loss: 0.7329 2024-06-01 06:16:43,216 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 8:24:29, time: 0.652, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0428, loss_cls: 0.1832, acc: 93.3103, loss_bbox: 0.2348, loss_mask: 0.2449, loss: 0.7266 2024-06-01 06:17:15,510 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 8:23:55, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0446, loss_cls: 0.1927, acc: 92.7607, loss_bbox: 0.2495, loss_mask: 0.2440, loss: 0.7516 2024-06-01 06:17:47,497 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 8:23:22, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0409, loss_cls: 0.1771, acc: 93.5876, loss_bbox: 0.2307, loss_mask: 0.2347, loss: 0.7017 2024-06-01 06:18:19,603 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 8:22:48, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0418, loss_cls: 0.1834, acc: 93.3928, loss_bbox: 0.2295, loss_mask: 0.2372, loss: 0.7136 2024-06-01 06:18:52,066 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 06:18:52,066 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 8:22:15, time: 0.649, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0436, loss_cls: 0.1905, acc: 93.0061, loss_bbox: 0.2434, loss_mask: 0.2398, loss: 0.7386 2024-06-01 06:19:26,481 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 8:21:44, time: 0.688, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0449, loss_cls: 0.1942, acc: 92.8494, loss_bbox: 0.2478, loss_mask: 0.2437, loss: 0.7518 2024-06-01 06:20:03,345 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 8:21:15, time: 0.737, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0419, loss_cls: 0.1735, acc: 93.6501, loss_bbox: 0.2240, loss_mask: 0.2392, loss: 0.6986 2024-06-01 06:20:40,100 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 8:20:47, time: 0.735, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0430, loss_cls: 0.1930, acc: 92.8723, loss_bbox: 0.2470, loss_mask: 0.2420, loss: 0.7467 2024-06-01 06:21:16,707 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 8:20:18, time: 0.732, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0443, loss_cls: 0.1866, acc: 93.1328, loss_bbox: 0.2409, loss_mask: 0.2383, loss: 0.7316 2024-06-01 06:21:48,835 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 8:19:45, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0405, loss_cls: 0.1827, acc: 93.3186, loss_bbox: 0.2333, loss_mask: 0.2408, loss: 0.7158 2024-06-01 06:22:20,928 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, eta: 8:19:11, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0426, loss_cls: 0.1884, acc: 93.1038, 
loss_bbox: 0.2423, loss_mask: 0.2402, loss: 0.7366 2024-06-01 06:22:53,113 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 8:18:38, time: 0.644, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0437, loss_cls: 0.1824, acc: 93.2471, loss_bbox: 0.2366, loss_mask: 0.2393, loss: 0.7226 2024-06-01 06:23:25,272 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 8:18:04, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0419, loss_cls: 0.1806, acc: 93.4465, loss_bbox: 0.2287, loss_mask: 0.2348, loss: 0.7058 2024-06-01 06:24:00,631 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 8:17:34, time: 0.707, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0454, loss_cls: 0.1950, acc: 92.7932, loss_bbox: 0.2508, loss_mask: 0.2464, loss: 0.7599 2024-06-01 06:24:32,647 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 8:17:01, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0432, loss_cls: 0.1766, acc: 93.5291, loss_bbox: 0.2337, loss_mask: 0.2417, loss: 0.7150 2024-06-01 06:25:04,924 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 8:16:27, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0417, loss_cls: 0.1830, acc: 93.2246, loss_bbox: 0.2360, loss_mask: 0.2402, loss: 0.7217 2024-06-01 06:25:36,599 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 8:15:53, time: 0.633, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0423, loss_cls: 0.1922, acc: 92.9797, loss_bbox: 0.2414, loss_mask: 0.2428, loss: 0.7385 2024-06-01 06:26:08,981 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 8:15:20, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0433, loss_cls: 0.1852, acc: 93.1934, loss_bbox: 0.2400, loss_mask: 0.2470, loss: 0.7355 2024-06-01 06:26:41,180 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 8:14:47, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0450, loss_cls: 0.1929, acc: 92.9465, loss_bbox: 0.2467, loss_mask: 0.2476, loss: 0.7541 2024-06-01 06:27:13,593 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 8:14:13, time: 0.648, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0425, loss_cls: 0.1873, acc: 93.1777, loss_bbox: 0.2387, loss_mask: 0.2393, loss: 0.7272 2024-06-01 06:27:46,193 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 8:13:40, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0440, loss_cls: 0.1889, acc: 93.0498, loss_bbox: 0.2411, loss_mask: 0.2436, loss: 0.7381 2024-06-01 06:28:18,333 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 8:13:07, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0409, loss_cls: 0.1799, acc: 93.3948, loss_bbox: 0.2262, loss_mask: 0.2303, loss: 0.6975 2024-06-01 06:28:50,548 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 8:12:34, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0444, loss_cls: 0.1938, acc: 92.8625, loss_bbox: 0.2497, loss_mask: 0.2428, loss: 0.7530 2024-06-01 06:29:22,619 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 8:12:00, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0403, loss_cls: 0.1785, acc: 93.5256, loss_bbox: 0.2256, loss_mask: 0.2420, loss: 0.7054 2024-06-01 06:29:55,104 - mmdet - INFO - Exp name: 
mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 06:29:55,104 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 8:11:27, time: 0.650, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0454, loss_cls: 0.1936, acc: 92.8933, loss_bbox: 0.2478, loss_mask: 0.2455, loss: 0.7549 2024-06-01 06:30:27,339 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 8:10:53, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0443, loss_cls: 0.1935, acc: 92.8875, loss_bbox: 0.2464, loss_mask: 0.2440, loss: 0.7490 2024-06-01 06:31:01,834 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 8:10:22, time: 0.690, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0445, loss_cls: 0.1905, acc: 93.0291, loss_bbox: 0.2448, loss_mask: 0.2428, loss: 0.7437 2024-06-01 06:31:34,075 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 8:09:49, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0439, loss_cls: 0.1915, acc: 93.0396, loss_bbox: 0.2415, loss_mask: 0.2421, loss: 0.7387 2024-06-01 06:32:15,794 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 8:09:26, time: 0.834, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0440, loss_cls: 0.1941, acc: 92.9543, loss_bbox: 0.2420, loss_mask: 0.2391, loss: 0.7397 2024-06-01 06:32:52,281 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 8:08:57, time: 0.730, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0409, loss_cls: 0.1852, acc: 93.1985, loss_bbox: 0.2369, loss_mask: 0.2417, loss: 0.7237 2024-06-01 06:33:24,482 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 8:08:23, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0442, loss_cls: 0.1944, acc: 92.8542, loss_bbox: 0.2444, loss_mask: 0.2390, loss: 0.7436 2024-06-01 06:33:56,308 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 8:07:49, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0407, loss_cls: 0.1846, acc: 93.2263, loss_bbox: 0.2376, loss_mask: 0.2399, loss: 0.7216 2024-06-01 06:34:28,331 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 8:07:16, time: 0.640, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0412, loss_cls: 0.1836, acc: 93.3218, loss_bbox: 0.2335, loss_mask: 0.2417, loss: 0.7203 2024-06-01 06:35:00,273 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 8:06:42, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0421, loss_cls: 0.1923, acc: 92.9053, loss_bbox: 0.2444, loss_mask: 0.2368, loss: 0.7364 2024-06-01 06:35:35,118 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 8:06:11, time: 0.697, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0417, loss_cls: 0.1890, acc: 92.9705, loss_bbox: 0.2439, loss_mask: 0.2432, loss: 0.7367 2024-06-01 06:36:07,533 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 8:05:38, time: 0.648, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0417, loss_cls: 0.1877, acc: 93.1177, loss_bbox: 0.2372, loss_mask: 0.2347, loss: 0.7220 2024-06-01 06:36:39,631 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 8:05:05, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0431, loss_cls: 0.1914, acc: 92.9280, loss_bbox: 0.2451, loss_mask: 0.2456, loss: 0.7446 2024-06-01 06:37:12,365 - mmdet - INFO - Epoch 
[6][7000/7330] lr: 1.000e-04, eta: 8:04:32, time: 0.655, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0454, loss_cls: 0.1938, acc: 92.9607, loss_bbox: 0.2423, loss_mask: 0.2405, loss: 0.7441
2024-06-01 06:37:44,400 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 8:03:58, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0414, loss_cls: 0.1744, acc: 93.6216, loss_bbox: 0.2263, loss_mask: 0.2296, loss: 0.6917
2024-06-01 06:38:16,510 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 8:03:25, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0438, loss_cls: 0.1927, acc: 93.0195, loss_bbox: 0.2395, loss_mask: 0.2446, loss: 0.7427
2024-06-01 06:38:49,018 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 8:02:51, time: 0.650, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0439, loss_cls: 0.1825, acc: 93.3799, loss_bbox: 0.2346, loss_mask: 0.2369, loss: 0.7207
2024-06-01 06:39:21,114 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 8:02:18, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0441, loss_cls: 0.1980, acc: 92.6453, loss_bbox: 0.2503, loss_mask: 0.2470, loss: 0.7607
2024-06-01 06:39:53,324 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 8:01:45, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0436, loss_cls: 0.1954, acc: 92.9216, loss_bbox: 0.2481, loss_mask: 0.2414, loss: 0.7505
2024-06-01 06:40:25,601 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 8:01:11, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0434, loss_cls: 0.1834, acc: 93.3279, loss_bbox: 0.2361, loss_mask: 0.2397, loss: 0.7248
2024-06-01 06:40:45,688 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-06-01 06:42:25,080 - mmdet - INFO - Evaluating bbox...
2024-06-01 06:42:52,100 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.448
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.687
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.488
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.263
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.488
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.620
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.567
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.567
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.567
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.366
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.615
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.737
2024-06-01 06:42:52,101 - mmdet - INFO - Evaluating segm...
2024-06-01 06:43:17,541 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.404
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.650
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.424
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.182
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.439
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.513
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.513
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.513
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.295
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.565
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.712
2024-06-01 06:43:18,090 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 06:43:18,092 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4480, bbox_mAP_50: 0.6870, bbox_mAP_75: 0.4880, bbox_mAP_s: 0.2630, bbox_mAP_m: 0.4880, bbox_mAP_l: 0.6200, bbox_mAP_copypaste: 0.448 0.687 0.488 0.263 0.488 0.620, segm_mAP: 0.4040, segm_mAP_50: 0.6500, segm_mAP_75: 0.4240, segm_mAP_s: 0.1820, segm_mAP_m: 0.4390, segm_mAP_l: 0.6300, segm_mAP_copypaste: 0.404 0.650 0.424 0.182 0.439 0.630
2024-06-01 06:43:53,727 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 8:00:02, time: 0.712, data_time: 0.099, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0394, loss_cls: 0.1799, acc: 93.4097, loss_bbox: 0.2308, loss_mask: 0.2315, loss: 0.6998
2024-06-01 06:44:28,490 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 7:59:31, time: 0.695, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0426, loss_cls: 0.1864, acc: 93.1052, loss_bbox: 0.2440, loss_mask: 0.2414, loss: 0.7353
2024-06-01 06:45:00,515 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 7:58:58, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0395, loss_cls: 0.1742, acc: 93.4744, loss_bbox: 0.2278, loss_mask: 0.2353, loss: 0.6971
2024-06-01 06:45:32,554 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 7:58:24, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0435, loss_cls: 0.1856, acc: 93.0608, loss_bbox: 0.2466, loss_mask: 0.2429, loss: 0.7375
2024-06-01 06:46:07,492 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 7:57:53, time: 0.698, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0421, loss_cls: 0.1823, acc: 93.1755, loss_bbox: 0.2429, loss_mask: 0.2429, loss: 0.7278
2024-06-01 06:46:41,990 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 7:57:22, time: 0.690, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0425, loss_cls: 0.1782, acc: 93.3169, loss_bbox: 0.2356, loss_mask: 0.2329, loss: 0.7078
2024-06-01 06:47:19,726 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 7:56:54, time: 0.755, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0434, loss_cls: 0.1832, acc: 93.1675, loss_bbox: 0.2388, loss_mask: 0.2325, loss: 0.7187
2024-06-01 06:47:51,706 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 7:56:21, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0431, loss_cls: 0.1800, acc: 93.3228, loss_bbox: 0.2364, loss_mask:
0.2363, loss: 0.7145 2024-06-01 06:48:25,821 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 7:55:49, time: 0.682, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0418, loss_cls: 0.1777, acc: 93.4402, loss_bbox: 0.2317, loss_mask: 0.2345, loss: 0.7044 2024-06-01 06:49:00,388 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 7:55:18, time: 0.691, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0430, loss_cls: 0.1815, acc: 93.2822, loss_bbox: 0.2346, loss_mask: 0.2371, loss: 0.7158 2024-06-01 06:49:32,144 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 7:54:44, time: 0.635, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0391, loss_cls: 0.1638, acc: 93.9224, loss_bbox: 0.2140, loss_mask: 0.2331, loss: 0.6687 2024-06-01 06:50:04,213 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 7:54:11, time: 0.641, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0395, loss_cls: 0.1652, acc: 93.7466, loss_bbox: 0.2243, loss_mask: 0.2336, loss: 0.6798 2024-06-01 06:50:36,485 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 7:53:38, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0437, loss_cls: 0.1839, acc: 93.2119, loss_bbox: 0.2371, loss_mask: 0.2369, loss: 0.7218 2024-06-01 06:51:08,495 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 7:53:04, time: 0.640, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0411, loss_cls: 0.1734, acc: 93.5952, loss_bbox: 0.2276, loss_mask: 0.2311, loss: 0.6913 2024-06-01 06:51:40,924 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 7:52:31, time: 0.649, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0416, loss_cls: 0.1781, acc: 93.4185, loss_bbox: 0.2344, loss_mask: 0.2393, loss: 0.7129 2024-06-01 06:52:12,764 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 7:51:57, time: 0.636, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0414, loss_cls: 0.1743, acc: 93.5415, loss_bbox: 0.2299, loss_mask: 0.2358, loss: 0.6992 2024-06-01 06:52:44,688 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 7:51:24, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0427, loss_cls: 0.1778, acc: 93.4419, loss_bbox: 0.2344, loss_mask: 0.2357, loss: 0.7083 2024-06-01 06:53:16,813 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 7:50:50, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0397, loss_cls: 0.1779, acc: 93.3289, loss_bbox: 0.2328, loss_mask: 0.2365, loss: 0.7043 2024-06-01 06:53:48,909 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 7:50:17, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0405, loss_cls: 0.1809, acc: 93.2917, loss_bbox: 0.2391, loss_mask: 0.2363, loss: 0.7153 2024-06-01 06:54:20,931 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 7:49:43, time: 0.640, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0414, loss_cls: 0.1742, acc: 93.5583, loss_bbox: 0.2259, loss_mask: 0.2355, loss: 0.6958 2024-06-01 06:54:53,229 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 7:49:10, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0421, loss_cls: 0.1817, acc: 93.2151, loss_bbox: 0.2383, loss_mask: 0.2384, loss: 0.7201 2024-06-01 06:55:27,379 - mmdet - INFO - Epoch [7][1100/7330] lr: 1.000e-04, eta: 7:48:38, time: 
0.683, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0389, loss_cls: 0.1725, acc: 93.5942, loss_bbox: 0.2294, loss_mask: 0.2349, loss: 0.6936 2024-06-01 06:56:01,930 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 7:48:07, time: 0.691, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0396, loss_cls: 0.1790, acc: 93.3276, loss_bbox: 0.2322, loss_mask: 0.2367, loss: 0.7053 2024-06-01 06:56:34,584 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 7:47:34, time: 0.653, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0416, loss_cls: 0.1764, acc: 93.4421, loss_bbox: 0.2328, loss_mask: 0.2311, loss: 0.7015 2024-06-01 06:57:07,002 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 7:47:01, time: 0.648, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0435, loss_cls: 0.1838, acc: 93.2769, loss_bbox: 0.2409, loss_mask: 0.2360, loss: 0.7228 2024-06-01 06:57:39,066 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 7:46:28, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0399, loss_cls: 0.1741, acc: 93.6433, loss_bbox: 0.2198, loss_mask: 0.2319, loss: 0.6830 2024-06-01 06:58:16,170 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 7:45:59, time: 0.742, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0376, loss_cls: 0.1707, acc: 93.6821, loss_bbox: 0.2297, loss_mask: 0.2399, loss: 0.6977 2024-06-01 06:58:54,137 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 7:45:31, time: 0.759, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0426, loss_cls: 0.1874, acc: 93.0278, loss_bbox: 0.2432, loss_mask: 0.2350, loss: 0.7268 2024-06-01 06:59:26,493 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 7:44:58, time: 0.647, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0420, loss_cls: 0.1806, acc: 93.1853, loss_bbox: 0.2379, loss_mask: 0.2368, loss: 0.7164 2024-06-01 07:00:01,123 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 7:44:27, time: 0.692, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0438, loss_cls: 0.1816, acc: 93.2639, loss_bbox: 0.2380, loss_mask: 0.2406, loss: 0.7235 2024-06-01 07:00:35,119 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 7:43:55, time: 0.680, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0390, loss_cls: 0.1732, acc: 93.7090, loss_bbox: 0.2215, loss_mask: 0.2312, loss: 0.6822 2024-06-01 07:01:07,138 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 7:43:22, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0393, loss_cls: 0.1712, acc: 93.6812, loss_bbox: 0.2288, loss_mask: 0.2372, loss: 0.6952 2024-06-01 07:01:39,256 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 7:42:48, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0428, loss_cls: 0.1790, acc: 93.2786, loss_bbox: 0.2365, loss_mask: 0.2387, loss: 0.7168 2024-06-01 07:02:11,366 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 7:42:15, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0403, loss_cls: 0.1784, acc: 93.3579, loss_bbox: 0.2315, loss_mask: 0.2354, loss: 0.7026 2024-06-01 07:02:43,224 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 7:41:41, time: 0.637, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0424, loss_cls: 0.1785, acc: 
93.3533, loss_bbox: 0.2374, loss_mask: 0.2387, loss: 0.7155 2024-06-01 07:03:15,396 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 7:41:08, time: 0.644, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0448, loss_cls: 0.1795, acc: 93.3181, loss_bbox: 0.2369, loss_mask: 0.2357, loss: 0.7166 2024-06-01 07:03:47,625 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 7:40:34, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0418, loss_cls: 0.1815, acc: 93.1709, loss_bbox: 0.2405, loss_mask: 0.2344, loss: 0.7171 2024-06-01 07:04:20,069 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 7:40:01, time: 0.649, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0412, loss_cls: 0.1885, acc: 92.9414, loss_bbox: 0.2462, loss_mask: 0.2379, loss: 0.7346 2024-06-01 07:04:52,164 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 7:39:28, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0432, loss_cls: 0.1809, acc: 93.2144, loss_bbox: 0.2352, loss_mask: 0.2323, loss: 0.7115 2024-06-01 07:05:24,078 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 7:38:54, time: 0.638, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0421, loss_cls: 0.1807, acc: 93.3438, loss_bbox: 0.2312, loss_mask: 0.2354, loss: 0.7083 2024-06-01 07:05:56,168 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 7:38:21, time: 0.642, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0425, loss_cls: 0.1759, acc: 93.4714, loss_bbox: 0.2329, loss_mask: 0.2388, loss: 0.7081 2024-06-01 07:06:28,279 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 7:37:47, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0386, loss_cls: 0.1727, acc: 93.6123, loss_bbox: 0.2221, loss_mask: 0.2320, loss: 0.6833 2024-06-01 07:07:02,554 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 7:37:16, time: 0.685, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0423, loss_cls: 0.1770, acc: 93.5522, loss_bbox: 0.2280, loss_mask: 0.2322, loss: 0.6970 2024-06-01 07:07:38,530 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 7:36:46, time: 0.720, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0443, loss_cls: 0.1873, acc: 93.0933, loss_bbox: 0.2403, loss_mask: 0.2363, loss: 0.7290 2024-06-01 07:08:10,782 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 7:36:13, time: 0.645, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0421, loss_cls: 0.1719, acc: 93.6506, loss_bbox: 0.2306, loss_mask: 0.2306, loss: 0.6954 2024-06-01 07:08:43,358 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 7:35:40, time: 0.652, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0452, loss_cls: 0.1892, acc: 92.9175, loss_bbox: 0.2459, loss_mask: 0.2393, loss: 0.7399 2024-06-01 07:09:15,607 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 7:35:06, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0418, loss_cls: 0.1853, acc: 93.1316, loss_bbox: 0.2399, loss_mask: 0.2315, loss: 0.7166 2024-06-01 07:09:52,620 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 7:34:37, time: 0.740, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0402, loss_cls: 0.1773, acc: 93.4265, loss_bbox: 0.2320, loss_mask: 0.2345, loss: 0.7012 2024-06-01 07:10:30,810 - mmdet - INFO - Epoch 
[7][2450/7330] lr: 1.000e-04, eta: 7:34:09, time: 0.764, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0434, loss_cls: 0.1871, acc: 93.0254, loss_bbox: 0.2464, loss_mask: 0.2366, loss: 0.7330 2024-06-01 07:11:02,903 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 7:33:36, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0405, loss_cls: 0.1745, acc: 93.5891, loss_bbox: 0.2262, loss_mask: 0.2327, loss: 0.6924 2024-06-01 07:11:38,439 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 7:33:06, time: 0.711, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0390, loss_cls: 0.1712, acc: 93.5791, loss_bbox: 0.2245, loss_mask: 0.2281, loss: 0.6814 2024-06-01 07:12:12,710 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 7:32:34, time: 0.685, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0445, loss_cls: 0.1919, acc: 92.8171, loss_bbox: 0.2485, loss_mask: 0.2400, loss: 0.7430 2024-06-01 07:12:44,855 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 7:32:01, time: 0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0407, loss_cls: 0.1774, acc: 93.5105, loss_bbox: 0.2258, loss_mask: 0.2262, loss: 0.6887 2024-06-01 07:13:16,857 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 7:31:27, time: 0.640, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0418, loss_cls: 0.1783, acc: 93.4309, loss_bbox: 0.2338, loss_mask: 0.2333, loss: 0.7064 2024-06-01 07:13:48,856 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 7:30:54, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0428, loss_cls: 0.1810, acc: 93.3479, loss_bbox: 0.2314, loss_mask: 0.2311, loss: 0.7052 2024-06-01 07:14:20,905 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 7:30:20, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0415, loss_cls: 0.1731, acc: 93.5999, loss_bbox: 0.2226, loss_mask: 0.2311, loss: 0.6855 2024-06-01 07:14:53,306 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 7:29:47, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0404, loss_cls: 0.1789, acc: 93.5398, loss_bbox: 0.2263, loss_mask: 0.2292, loss: 0.6940 2024-06-01 07:15:25,183 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 7:29:13, time: 0.637, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0400, loss_cls: 0.1775, acc: 93.4661, loss_bbox: 0.2297, loss_mask: 0.2319, loss: 0.6970 2024-06-01 07:15:57,350 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 7:28:40, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0415, loss_cls: 0.1739, acc: 93.6233, loss_bbox: 0.2293, loss_mask: 0.2361, loss: 0.6993 2024-06-01 07:16:29,586 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 7:28:07, time: 0.645, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0430, loss_cls: 0.1795, acc: 93.3003, loss_bbox: 0.2372, loss_mask: 0.2386, loss: 0.7180 2024-06-01 07:17:01,764 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 7:27:33, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0405, loss_cls: 0.1823, acc: 93.2354, loss_bbox: 0.2366, loss_mask: 0.2389, loss: 0.7178 2024-06-01 07:17:33,660 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 7:27:00, time: 0.638, data_time: 0.034, memory: 13277, loss_rpn_cls: 
0.0180, loss_rpn_bbox: 0.0430, loss_cls: 0.1825, acc: 93.1804, loss_bbox: 0.2376, loss_mask: 0.2395, loss: 0.7206 2024-06-01 07:18:05,423 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 7:26:26, time: 0.635, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0384, loss_cls: 0.1669, acc: 93.8098, loss_bbox: 0.2185, loss_mask: 0.2330, loss: 0.6749 2024-06-01 07:18:39,889 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 7:25:55, time: 0.689, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0407, loss_cls: 0.1803, acc: 93.3723, loss_bbox: 0.2328, loss_mask: 0.2321, loss: 0.7054 2024-06-01 07:19:15,299 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 7:25:24, time: 0.708, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0427, loss_cls: 0.1830, acc: 93.1584, loss_bbox: 0.2361, loss_mask: 0.2369, loss: 0.7180 2024-06-01 07:19:47,392 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 7:24:51, time: 0.642, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0436, loss_cls: 0.1808, acc: 93.3318, loss_bbox: 0.2360, loss_mask: 0.2335, loss: 0.7124 2024-06-01 07:20:19,512 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 7:24:17, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0416, loss_cls: 0.1811, acc: 93.2500, loss_bbox: 0.2368, loss_mask: 0.2412, loss: 0.7203 2024-06-01 07:20:51,979 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 7:23:44, time: 0.649, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0432, loss_cls: 0.1812, acc: 93.3225, loss_bbox: 0.2333, loss_mask: 0.2367, loss: 0.7135 2024-06-01 07:21:28,884 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 7:23:15, time: 0.738, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0410, loss_cls: 0.1763, acc: 93.5044, loss_bbox: 0.2280, loss_mask: 0.2348, loss: 0.7003 2024-06-01 07:22:03,646 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 7:22:44, time: 0.695, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0420, loss_cls: 0.1816, acc: 93.4006, loss_bbox: 0.2326, loss_mask: 0.2363, loss: 0.7120 2024-06-01 07:22:38,001 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 7:22:12, time: 0.687, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0415, loss_cls: 0.1770, acc: 93.5322, loss_bbox: 0.2332, loss_mask: 0.2369, loss: 0.7066 2024-06-01 07:23:14,078 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 7:21:42, time: 0.721, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0434, loss_cls: 0.1840, acc: 93.2119, loss_bbox: 0.2417, loss_mask: 0.2431, loss: 0.7320 2024-06-01 07:23:48,312 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 7:21:11, time: 0.685, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0403, loss_cls: 0.1724, acc: 93.6165, loss_bbox: 0.2189, loss_mask: 0.2367, loss: 0.6862 2024-06-01 07:24:20,210 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 7:20:37, time: 0.638, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0401, loss_cls: 0.1788, acc: 93.4097, loss_bbox: 0.2301, loss_mask: 0.2320, loss: 0.6992 2024-06-01 07:24:52,280 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 7:20:04, time: 0.641, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0429, loss_cls: 0.1795, acc: 93.3552, loss_bbox: 0.2303, loss_mask: 0.2342, loss: 
0.7054 2024-06-01 07:25:24,647 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 7:19:30, time: 0.647, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0415, loss_cls: 0.1812, acc: 93.3660, loss_bbox: 0.2289, loss_mask: 0.2319, loss: 0.7033 2024-06-01 07:25:56,547 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 7:18:57, time: 0.638, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0415, loss_cls: 0.1778, acc: 93.5249, loss_bbox: 0.2324, loss_mask: 0.2383, loss: 0.7080 2024-06-01 07:26:28,606 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 7:18:23, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0455, loss_cls: 0.1915, acc: 92.9631, loss_bbox: 0.2429, loss_mask: 0.2455, loss: 0.7474 2024-06-01 07:27:00,642 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 7:17:50, time: 0.641, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0396, loss_cls: 0.1771, acc: 93.5024, loss_bbox: 0.2278, loss_mask: 0.2340, loss: 0.6959 2024-06-01 07:27:32,778 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 7:17:16, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0403, loss_cls: 0.1763, acc: 93.4763, loss_bbox: 0.2275, loss_mask: 0.2309, loss: 0.6921 2024-06-01 07:28:04,684 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 7:16:43, time: 0.639, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0422, loss_cls: 0.1802, acc: 93.3579, loss_bbox: 0.2319, loss_mask: 0.2362, loss: 0.7100 2024-06-01 07:28:37,101 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 7:16:10, time: 0.648, data_time: 0.021, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0432, loss_cls: 0.1909, acc: 93.0144, loss_bbox: 0.2419, loss_mask: 0.2359, loss: 0.7326 2024-06-01 07:29:09,426 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 7:15:37, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0418, loss_cls: 0.1793, acc: 93.3501, loss_bbox: 0.2320, loss_mask: 0.2364, loss: 0.7085 2024-06-01 07:29:41,447 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 7:15:03, time: 0.641, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0414, loss_cls: 0.1782, acc: 93.3987, loss_bbox: 0.2292, loss_mask: 0.2424, loss: 0.7103 2024-06-01 07:30:16,065 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 7:14:32, time: 0.692, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0415, loss_cls: 0.1806, acc: 93.4011, loss_bbox: 0.2285, loss_mask: 0.2325, loss: 0.7027 2024-06-01 07:30:50,664 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 7:14:00, time: 0.692, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0420, loss_cls: 0.1796, acc: 93.3347, loss_bbox: 0.2288, loss_mask: 0.2317, loss: 0.6998 2024-06-01 07:31:23,193 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 7:13:27, time: 0.651, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0453, loss_cls: 0.1872, acc: 93.1257, loss_bbox: 0.2420, loss_mask: 0.2378, loss: 0.7323 2024-06-01 07:31:55,097 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 7:12:54, time: 0.638, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0412, loss_cls: 0.1789, acc: 93.4490, loss_bbox: 0.2317, loss_mask: 0.2316, loss: 0.7025 2024-06-01 07:32:27,226 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 7:12:20, time: 
0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0403, loss_cls: 0.1753, acc: 93.5728, loss_bbox: 0.2314, loss_mask: 0.2386, loss: 0.7040 2024-06-01 07:33:03,611 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 7:11:51, time: 0.728, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0422, loss_cls: 0.1842, acc: 93.1453, loss_bbox: 0.2365, loss_mask: 0.2381, loss: 0.7217 2024-06-01 07:33:38,209 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 7:11:19, time: 0.692, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0390, loss_cls: 0.1730, acc: 93.5642, loss_bbox: 0.2309, loss_mask: 0.2346, loss: 0.6950 2024-06-01 07:34:12,783 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 7:10:48, time: 0.692, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0417, loss_cls: 0.1852, acc: 93.2102, loss_bbox: 0.2345, loss_mask: 0.2368, loss: 0.7179 2024-06-01 07:34:47,319 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 7:10:16, time: 0.691, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0394, loss_cls: 0.1773, acc: 93.3879, loss_bbox: 0.2297, loss_mask: 0.2313, loss: 0.6970 2024-06-01 07:35:21,530 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 7:09:45, time: 0.684, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0379, loss_cls: 0.1723, acc: 93.6694, loss_bbox: 0.2264, loss_mask: 0.2303, loss: 0.6843 2024-06-01 07:35:53,891 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 7:09:11, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0433, loss_cls: 0.1798, acc: 93.2927, loss_bbox: 0.2296, loss_mask: 0.2368, loss: 0.7076 2024-06-01 07:36:26,339 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 7:08:38, time: 0.649, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0435, loss_cls: 0.1882, acc: 92.9817, loss_bbox: 0.2436, loss_mask: 0.2395, loss: 0.7347 2024-06-01 07:36:58,790 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 7:08:05, time: 0.649, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0420, loss_cls: 0.1825, acc: 93.2800, loss_bbox: 0.2363, loss_mask: 0.2336, loss: 0.7136 2024-06-01 07:37:31,024 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 7:07:32, time: 0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0409, loss_cls: 0.1791, acc: 93.4775, loss_bbox: 0.2315, loss_mask: 0.2341, loss: 0.7051 2024-06-01 07:38:03,306 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 7:06:59, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0423, loss_cls: 0.1811, acc: 93.3965, loss_bbox: 0.2297, loss_mask: 0.2318, loss: 0.7047 2024-06-01 07:38:35,319 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 7:06:25, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0431, loss_cls: 0.1814, acc: 93.2100, loss_bbox: 0.2375, loss_mask: 0.2381, loss: 0.7212 2024-06-01 07:39:07,427 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 7:05:52, time: 0.642, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0403, loss_cls: 0.1748, acc: 93.5417, loss_bbox: 0.2307, loss_mask: 0.2311, loss: 0.6950 2024-06-01 07:39:40,082 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 7:05:19, time: 0.653, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0427, loss_cls: 0.1815, acc: 
93.3445, loss_bbox: 0.2327, loss_mask: 0.2371, loss: 0.7145 2024-06-01 07:40:12,213 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 7:04:46, time: 0.643, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0390, loss_cls: 0.1752, acc: 93.4785, loss_bbox: 0.2271, loss_mask: 0.2316, loss: 0.6915 2024-06-01 07:40:44,306 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 7:04:12, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0414, loss_cls: 0.1740, acc: 93.5215, loss_bbox: 0.2310, loss_mask: 0.2379, loss: 0.7020 2024-06-01 07:41:16,522 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 7:03:39, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0427, loss_cls: 0.1818, acc: 93.1555, loss_bbox: 0.2376, loss_mask: 0.2398, loss: 0.7211 2024-06-01 07:41:51,187 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 7:03:07, time: 0.693, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0458, loss_cls: 0.1904, acc: 93.0215, loss_bbox: 0.2390, loss_mask: 0.2385, loss: 0.7364 2024-06-01 07:42:25,755 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 7:02:36, time: 0.691, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0391, loss_cls: 0.1753, acc: 93.5312, loss_bbox: 0.2233, loss_mask: 0.2332, loss: 0.6892 2024-06-01 07:42:57,644 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 7:02:02, time: 0.637, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0402, loss_cls: 0.1725, acc: 93.5876, loss_bbox: 0.2295, loss_mask: 0.2379, loss: 0.6986 2024-06-01 07:43:30,296 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 7:01:30, time: 0.653, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0423, loss_cls: 0.1823, acc: 93.2334, loss_bbox: 0.2374, loss_mask: 0.2397, loss: 0.7218 2024-06-01 07:44:02,980 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 7:00:57, time: 0.654, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0444, loss_cls: 0.1802, acc: 93.2927, loss_bbox: 0.2359, loss_mask: 0.2398, loss: 0.7213 2024-06-01 07:44:39,954 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 7:00:27, time: 0.739, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0422, loss_cls: 0.1795, acc: 93.3445, loss_bbox: 0.2301, loss_mask: 0.2368, loss: 0.7096 2024-06-01 07:45:12,335 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 6:59:54, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0420, loss_cls: 0.1811, acc: 93.2996, loss_bbox: 0.2363, loss_mask: 0.2348, loss: 0.7137 2024-06-01 07:45:49,315 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 6:59:24, time: 0.740, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0401, loss_cls: 0.1705, acc: 93.5776, loss_bbox: 0.2291, loss_mask: 0.2407, loss: 0.6997 2024-06-01 07:46:23,587 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 6:58:53, time: 0.685, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0406, loss_cls: 0.1768, acc: 93.5198, loss_bbox: 0.2281, loss_mask: 0.2350, loss: 0.6986 2024-06-01 07:46:57,923 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 6:58:21, time: 0.687, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0418, loss_cls: 0.1814, acc: 93.3960, loss_bbox: 0.2337, loss_mask: 0.2376, loss: 0.7127 2024-06-01 07:47:30,065 - mmdet - INFO - Epoch 
[7][5800/7330] lr: 1.000e-04, eta: 6:57:47, time: 0.643, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0435, loss_cls: 0.1884, acc: 92.9771, loss_bbox: 0.2432, loss_mask: 0.2385, loss: 0.7351 2024-06-01 07:48:02,394 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 6:57:14, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0392, loss_cls: 0.1812, acc: 93.2571, loss_bbox: 0.2332, loss_mask: 0.2343, loss: 0.7067 2024-06-01 07:48:34,455 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 6:56:41, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0413, loss_cls: 0.1841, acc: 93.1680, loss_bbox: 0.2381, loss_mask: 0.2394, loss: 0.7226 2024-06-01 07:49:06,407 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 6:56:07, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0421, loss_cls: 0.1821, acc: 93.3594, loss_bbox: 0.2316, loss_mask: 0.2389, loss: 0.7136 2024-06-01 07:49:38,752 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 6:55:34, time: 0.647, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0428, loss_cls: 0.1815, acc: 93.3132, loss_bbox: 0.2316, loss_mask: 0.2386, loss: 0.7159 2024-06-01 07:50:11,487 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 6:55:01, time: 0.655, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0437, loss_cls: 0.1866, acc: 93.0703, loss_bbox: 0.2434, loss_mask: 0.2421, loss: 0.7358 2024-06-01 07:50:43,498 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 6:54:28, time: 0.640, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0421, loss_cls: 0.1793, acc: 93.3911, loss_bbox: 0.2315, loss_mask: 0.2355, loss: 0.7076 2024-06-01 07:51:15,519 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 6:53:54, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0410, loss_cls: 0.1836, acc: 93.2759, loss_bbox: 0.2379, loss_mask: 0.2377, loss: 0.7191 2024-06-01 07:51:47,659 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 6:53:21, time: 0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0433, loss_cls: 0.1903, acc: 92.9500, loss_bbox: 0.2466, loss_mask: 0.2405, loss: 0.7396 2024-06-01 07:52:19,814 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 6:52:48, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0422, loss_cls: 0.1823, acc: 93.3162, loss_bbox: 0.2313, loss_mask: 0.2309, loss: 0.7072 2024-06-01 07:52:51,969 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 6:52:14, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0447, loss_cls: 0.1804, acc: 93.2571, loss_bbox: 0.2394, loss_mask: 0.2364, loss: 0.7220 2024-06-01 07:53:26,321 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 6:51:43, time: 0.687, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0401, loss_cls: 0.1757, acc: 93.4482, loss_bbox: 0.2339, loss_mask: 0.2314, loss: 0.6993 2024-06-01 07:54:02,642 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 6:51:13, time: 0.726, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0417, loss_cls: 0.1830, acc: 93.1799, loss_bbox: 0.2320, loss_mask: 0.2375, loss: 0.7131 2024-06-01 07:54:35,035 - mmdet - INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 6:50:39, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 
0.0200, loss_rpn_bbox: 0.0418, loss_cls: 0.1814, acc: 93.2700, loss_bbox: 0.2369, loss_mask: 0.2399, loss: 0.7200 2024-06-01 07:55:07,479 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 6:50:06, time: 0.649, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0413, loss_cls: 0.1823, acc: 93.3267, loss_bbox: 0.2312, loss_mask: 0.2380, loss: 0.7124 2024-06-01 07:55:39,574 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 6:49:33, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0421, loss_cls: 0.1811, acc: 93.2632, loss_bbox: 0.2350, loss_mask: 0.2372, loss: 0.7156 2024-06-01 07:56:16,653 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 6:49:03, time: 0.741, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0414, loss_cls: 0.1733, acc: 93.5256, loss_bbox: 0.2295, loss_mask: 0.2324, loss: 0.6948 2024-06-01 07:56:48,527 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 6:48:30, time: 0.638, data_time: 0.021, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0401, loss_cls: 0.1703, acc: 93.8098, loss_bbox: 0.2182, loss_mask: 0.2311, loss: 0.6793 2024-06-01 07:57:25,947 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 6:48:00, time: 0.748, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0404, loss_cls: 0.1735, acc: 93.6433, loss_bbox: 0.2228, loss_mask: 0.2298, loss: 0.6860 2024-06-01 07:58:00,712 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 6:47:29, time: 0.695, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0444, loss_cls: 0.1830, acc: 93.2358, loss_bbox: 0.2355, loss_mask: 0.2398, loss: 0.7239 2024-06-01 07:58:35,275 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 6:46:57, time: 0.691, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0439, loss_cls: 0.1857, acc: 93.0845, loss_bbox: 0.2442, loss_mask: 0.2376, loss: 0.7318 2024-06-01 07:59:07,478 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 6:46:24, time: 0.644, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0415, loss_cls: 0.1790, acc: 93.3538, loss_bbox: 0.2326, loss_mask: 0.2337, loss: 0.7052 2024-06-01 07:59:39,552 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 6:45:51, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0407, loss_cls: 0.1839, acc: 93.2539, loss_bbox: 0.2385, loss_mask: 0.2342, loss: 0.7171 2024-06-01 08:00:11,519 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 6:45:17, time: 0.639, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0404, loss_cls: 0.1781, acc: 93.3987, loss_bbox: 0.2312, loss_mask: 0.2363, loss: 0.7050 2024-06-01 08:00:43,635 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 6:44:44, time: 0.642, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0409, loss_cls: 0.1754, acc: 93.4648, loss_bbox: 0.2286, loss_mask: 0.2394, loss: 0.7042 2024-06-01 08:01:15,687 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 6:44:10, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0426, loss_cls: 0.1757, acc: 93.5015, loss_bbox: 0.2215, loss_mask: 0.2354, loss: 0.6940 2024-06-01 08:01:47,987 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 6:43:37, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0428, loss_cls: 0.1872, acc: 93.0601, loss_bbox: 0.2405, loss_mask: 0.2457, loss: 
0.7345
2024-06-01 08:02:19,600 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 6:43:03, time: 0.632, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0390, loss_cls: 0.1676, acc: 93.8186, loss_bbox: 0.2197, loss_mask: 0.2294, loss: 0.6728
2024-06-01 08:02:52,022 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 6:42:30, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0399, loss_cls: 0.1715, acc: 93.6145, loss_bbox: 0.2217, loss_mask: 0.2284, loss: 0.6806
2024-06-01 08:03:23,944 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 6:41:57, time: 0.638, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0407, loss_cls: 0.1796, acc: 93.3821, loss_bbox: 0.2306, loss_mask: 0.2355, loss: 0.7063
2024-06-01 08:03:55,911 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 6:41:23, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0409, loss_cls: 0.1688, acc: 93.7639, loss_bbox: 0.2177, loss_mask: 0.2293, loss: 0.6764
2024-06-01 08:04:15,747 - mmdet - INFO - Saving checkpoint at 7 epochs
2024-06-01 08:05:55,592 - mmdet - INFO - Evaluating bbox...
2024-06-01 08:06:18,705 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.452
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.693
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.494
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.273
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.497
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.612
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.573
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.573
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.573
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.377
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.627
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.738
2024-06-01 08:06:18,706 - mmdet - INFO - Evaluating segm...
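The AP/AR tables in this log are the standard pycocotools summaries that MMDetection prints during validation. As a rough reference only (not taken from this run; the annotation and result-file paths below are placeholders), the following minimal Python sketch reproduces a summary table of this form offline from a dumped result file; params.maxDets is set to (100, 300, 1000) to match the maxDets columns shown here.

    # Minimal sketch with placeholder paths, not part of the original log.
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    coco_gt = COCO('data/coco/annotations/instances_val2017.json')  # placeholder annotation file
    coco_dt = coco_gt.loadRes('results.bbox.json')                  # placeholder detection dump

    coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')  # 'segm' for the mask table
    coco_eval.params.maxDets = [100, 300, 1000]             # matches the maxDets values in this log
    coco_eval.evaluate()
    coco_eval.accumulate()
    coco_eval.summarize()  # prints a 12-line AP/AR table in the format shown above

Switching iouType to 'segm' and loading the corresponding mask results gives an instance-segmentation table of the same shape, analogous to the segm block that follows.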
2024-06-01 08:06:42,368 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.406
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.658
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.427
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.190
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.447
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.306
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.574
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.713
2024-06-01 08:06:42,690 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 08:06:42,692 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4520, bbox_mAP_50: 0.6930, bbox_mAP_75: 0.4940, bbox_mAP_s: 0.2730, bbox_mAP_m: 0.4970, bbox_mAP_l: 0.6120, bbox_mAP_copypaste: 0.452 0.693 0.494 0.273 0.497 0.612, segm_mAP: 0.4060, segm_mAP_50: 0.6580, segm_mAP_75: 0.4270, segm_mAP_s: 0.1900, segm_mAP_m: 0.4470, segm_mAP_l: 0.6300, segm_mAP_copypaste: 0.406 0.658 0.427 0.190 0.447 0.630
2024-06-01 08:07:27,547 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 6:40:25, time: 0.897, data_time: 0.097, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0435, loss_cls: 0.1800, acc: 93.1819, loss_bbox: 0.2380, loss_mask: 0.2313, loss: 0.7102
2024-06-01 08:07:59,863 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 6:39:52, time: 0.646, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0385, loss_cls: 0.1664, acc: 93.6885, loss_bbox: 0.2231, loss_mask: 0.2245, loss: 0.6690
2024-06-01 08:08:31,699 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 6:39:19, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0406, loss_cls: 0.1776, acc: 93.5422, loss_bbox: 0.2230, loss_mask: 0.2256, loss: 0.6844
2024-06-01 08:09:03,965 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 6:38:45, time: 0.645, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0383, loss_cls: 0.1635, acc: 93.8647, loss_bbox: 0.2168, loss_mask: 0.2251, loss: 0.6612
2024-06-01 08:09:36,107 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 6:38:12, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0395, loss_cls: 0.1739, acc: 93.5981, loss_bbox: 0.2261, loss_mask: 0.2295, loss: 0.6865
2024-06-01 08:10:08,626 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 6:37:39, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0454, loss_cls: 0.1779, acc: 93.2981, loss_bbox: 0.2386, loss_mask: 0.2345, loss: 0.7152
2024-06-01 08:10:42,838 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 6:37:07, time: 0.684, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0403, loss_cls: 0.1691, acc: 93.6694, loss_bbox: 0.2233, loss_mask: 0.2259, loss: 0.6748
2024-06-01 08:11:17,185 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 6:36:35, time: 0.687, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0383, loss_cls: 0.1669, acc: 93.7405, loss_bbox: 0.2300, loss_mask:
0.2312, loss: 0.6822 2024-06-01 08:11:49,143 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 6:36:02, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0412, loss_cls: 0.1705, acc: 93.6509, loss_bbox: 0.2292, loss_mask: 0.2361, loss: 0.6945 2024-06-01 08:12:21,330 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 6:35:29, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0423, loss_cls: 0.1752, acc: 93.5000, loss_bbox: 0.2307, loss_mask: 0.2304, loss: 0.6956 2024-06-01 08:12:53,570 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 6:34:56, time: 0.645, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0402, loss_cls: 0.1705, acc: 93.6196, loss_bbox: 0.2295, loss_mask: 0.2270, loss: 0.6835 2024-06-01 08:13:30,385 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 6:34:26, time: 0.736, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0421, loss_cls: 0.1890, acc: 92.8999, loss_bbox: 0.2420, loss_mask: 0.2384, loss: 0.7300 2024-06-01 08:14:02,512 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 6:33:52, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0428, loss_cls: 0.1752, acc: 93.3882, loss_bbox: 0.2342, loss_mask: 0.2369, loss: 0.7086 2024-06-01 08:14:34,394 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 6:33:19, time: 0.638, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0395, loss_cls: 0.1676, acc: 93.8430, loss_bbox: 0.2149, loss_mask: 0.2224, loss: 0.6604 2024-06-01 08:15:08,870 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 6:32:47, time: 0.690, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0408, loss_cls: 0.1775, acc: 93.3704, loss_bbox: 0.2330, loss_mask: 0.2415, loss: 0.7114 2024-06-01 08:15:41,118 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 6:32:14, time: 0.645, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0407, loss_cls: 0.1712, acc: 93.6101, loss_bbox: 0.2242, loss_mask: 0.2311, loss: 0.6855 2024-06-01 08:16:12,911 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 6:31:40, time: 0.636, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0394, loss_cls: 0.1684, acc: 93.8057, loss_bbox: 0.2236, loss_mask: 0.2365, loss: 0.6855 2024-06-01 08:16:47,190 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 6:31:08, time: 0.686, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0382, loss_cls: 0.1659, acc: 93.8645, loss_bbox: 0.2143, loss_mask: 0.2264, loss: 0.6613 2024-06-01 08:17:19,249 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 6:30:35, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0409, loss_cls: 0.1718, acc: 93.4973, loss_bbox: 0.2295, loss_mask: 0.2292, loss: 0.6886 2024-06-01 08:17:51,448 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 6:30:02, time: 0.644, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0421, loss_cls: 0.1817, acc: 93.1353, loss_bbox: 0.2399, loss_mask: 0.2346, loss: 0.7165 2024-06-01 08:18:23,611 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 6:29:29, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0430, loss_cls: 0.1715, acc: 93.4875, loss_bbox: 0.2267, loss_mask: 0.2304, loss: 0.6879 2024-06-01 08:18:55,752 - mmdet - INFO - Epoch [8][1100/7330] lr: 1.000e-04, eta: 6:28:55, time: 
0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0424, loss_cls: 0.1747, acc: 93.4756, loss_bbox: 0.2299, loss_mask: 0.2325, loss: 0.6969 2024-06-01 08:19:27,849 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 6:28:22, time: 0.642, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0427, loss_cls: 0.1703, acc: 93.5713, loss_bbox: 0.2244, loss_mask: 0.2276, loss: 0.6826 2024-06-01 08:20:00,000 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 6:27:49, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0400, loss_cls: 0.1786, acc: 93.3496, loss_bbox: 0.2317, loss_mask: 0.2307, loss: 0.6982 2024-06-01 08:20:34,468 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 6:27:17, time: 0.689, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0426, loss_cls: 0.1777, acc: 93.3962, loss_bbox: 0.2314, loss_mask: 0.2319, loss: 0.7003 2024-06-01 08:21:06,472 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 6:26:44, time: 0.640, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0391, loss_cls: 0.1701, acc: 93.6387, loss_bbox: 0.2229, loss_mask: 0.2264, loss: 0.6739 2024-06-01 08:21:40,368 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 6:26:12, time: 0.678, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0353, loss_cls: 0.1559, acc: 94.1599, loss_bbox: 0.2085, loss_mask: 0.2150, loss: 0.6291 2024-06-01 08:22:15,684 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 6:25:40, time: 0.706, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0411, loss_cls: 0.1736, acc: 93.5508, loss_bbox: 0.2274, loss_mask: 0.2286, loss: 0.6887 2024-06-01 08:22:49,940 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 6:25:09, time: 0.685, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0409, loss_cls: 0.1740, acc: 93.4978, loss_bbox: 0.2265, loss_mask: 0.2266, loss: 0.6857 2024-06-01 08:23:21,992 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 6:24:35, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0414, loss_cls: 0.1768, acc: 93.3940, loss_bbox: 0.2258, loss_mask: 0.2357, loss: 0.6980 2024-06-01 08:23:53,828 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 6:24:02, time: 0.637, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0398, loss_cls: 0.1749, acc: 93.3896, loss_bbox: 0.2309, loss_mask: 0.2293, loss: 0.6925 2024-06-01 08:24:25,807 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 6:23:28, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0398, loss_cls: 0.1707, acc: 93.6316, loss_bbox: 0.2244, loss_mask: 0.2305, loss: 0.6823 2024-06-01 08:25:02,910 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 6:22:58, time: 0.742, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0399, loss_cls: 0.1759, acc: 93.4460, loss_bbox: 0.2294, loss_mask: 0.2302, loss: 0.6923 2024-06-01 08:25:35,267 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 6:22:25, time: 0.647, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0404, loss_cls: 0.1755, acc: 93.4312, loss_bbox: 0.2324, loss_mask: 0.2341, loss: 0.7002 2024-06-01 08:26:07,261 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 6:21:52, time: 0.640, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0419, loss_cls: 0.1711, acc: 
93.5620, loss_bbox: 0.2267, loss_mask: 0.2323, loss: 0.6908 2024-06-01 08:26:41,826 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 6:21:20, time: 0.691, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0401, loss_cls: 0.1772, acc: 93.4509, loss_bbox: 0.2265, loss_mask: 0.2354, loss: 0.6979 2024-06-01 08:27:14,074 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 6:20:47, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0410, loss_cls: 0.1748, acc: 93.4883, loss_bbox: 0.2334, loss_mask: 0.2319, loss: 0.6980 2024-06-01 08:27:46,155 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 6:20:14, time: 0.642, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0398, loss_cls: 0.1708, acc: 93.6880, loss_bbox: 0.2261, loss_mask: 0.2331, loss: 0.6872 2024-06-01 08:28:20,039 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 6:19:41, time: 0.678, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0406, loss_cls: 0.1737, acc: 93.4714, loss_bbox: 0.2275, loss_mask: 0.2333, loss: 0.6927 2024-06-01 08:28:51,705 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 6:19:08, time: 0.633, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0378, loss_cls: 0.1638, acc: 93.8821, loss_bbox: 0.2171, loss_mask: 0.2265, loss: 0.6618 2024-06-01 08:29:23,856 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 6:18:35, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0435, loss_cls: 0.1696, acc: 93.6206, loss_bbox: 0.2287, loss_mask: 0.2316, loss: 0.6920 2024-06-01 08:29:56,214 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 6:18:02, time: 0.647, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0398, loss_cls: 0.1708, acc: 93.6697, loss_bbox: 0.2252, loss_mask: 0.2278, loss: 0.6815 2024-06-01 08:30:28,379 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 6:17:28, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0403, loss_cls: 0.1713, acc: 93.5076, loss_bbox: 0.2257, loss_mask: 0.2307, loss: 0.6853 2024-06-01 08:31:00,308 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 6:16:55, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0401, loss_cls: 0.1744, acc: 93.4468, loss_bbox: 0.2316, loss_mask: 0.2275, loss: 0.6914 2024-06-01 08:31:32,474 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 6:16:22, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0427, loss_cls: 0.1803, acc: 93.2573, loss_bbox: 0.2367, loss_mask: 0.2327, loss: 0.7110 2024-06-01 08:32:06,473 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 6:15:50, time: 0.680, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0397, loss_cls: 0.1648, acc: 93.8682, loss_bbox: 0.2257, loss_mask: 0.2373, loss: 0.6852 2024-06-01 08:32:38,399 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 6:15:16, time: 0.639, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0373, loss_cls: 0.1711, acc: 93.6948, loss_bbox: 0.2239, loss_mask: 0.2319, loss: 0.6809 2024-06-01 08:33:12,883 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 6:14:44, time: 0.690, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0406, loss_cls: 0.1710, acc: 93.6426, loss_bbox: 0.2255, loss_mask: 0.2277, loss: 0.6809 2024-06-01 08:33:47,765 - mmdet - INFO - Epoch 
[8][2450/7330] lr: 1.000e-04, eta: 6:14:13, time: 0.698, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0408, loss_cls: 0.1748, acc: 93.5090, loss_bbox: 0.2249, loss_mask: 0.2336, loss: 0.6919 2024-06-01 08:34:22,010 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 6:13:41, time: 0.685, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0408, loss_cls: 0.1754, acc: 93.4907, loss_bbox: 0.2310, loss_mask: 0.2307, loss: 0.6952 2024-06-01 08:34:54,551 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 6:13:08, time: 0.650, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0450, loss_cls: 0.1713, acc: 93.5991, loss_bbox: 0.2232, loss_mask: 0.2321, loss: 0.6913 2024-06-01 08:35:26,519 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 6:12:35, time: 0.640, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0403, loss_cls: 0.1742, acc: 93.4575, loss_bbox: 0.2302, loss_mask: 0.2303, loss: 0.6926 2024-06-01 08:35:58,700 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 6:12:01, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0427, loss_cls: 0.1761, acc: 93.4277, loss_bbox: 0.2310, loss_mask: 0.2318, loss: 0.6987 2024-06-01 08:36:37,246 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 6:11:32, time: 0.771, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0394, loss_cls: 0.1718, acc: 93.5718, loss_bbox: 0.2264, loss_mask: 0.2321, loss: 0.6873 2024-06-01 08:37:09,546 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 6:10:59, time: 0.646, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0399, loss_cls: 0.1746, acc: 93.4624, loss_bbox: 0.2317, loss_mask: 0.2377, loss: 0.7040 2024-06-01 08:37:41,544 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 6:10:26, time: 0.640, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0391, loss_cls: 0.1709, acc: 93.6670, loss_bbox: 0.2198, loss_mask: 0.2293, loss: 0.6756 2024-06-01 08:38:15,682 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 6:09:54, time: 0.683, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0402, loss_cls: 0.1719, acc: 93.6094, loss_bbox: 0.2249, loss_mask: 0.2341, loss: 0.6877 2024-06-01 08:38:47,980 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 6:09:20, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0419, loss_cls: 0.1706, acc: 93.6562, loss_bbox: 0.2258, loss_mask: 0.2298, loss: 0.6876 2024-06-01 08:39:19,943 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 6:08:47, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0419, loss_cls: 0.1750, acc: 93.4514, loss_bbox: 0.2273, loss_mask: 0.2282, loss: 0.6909 2024-06-01 08:39:54,551 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 6:08:15, time: 0.692, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0425, loss_cls: 0.1783, acc: 93.3455, loss_bbox: 0.2335, loss_mask: 0.2328, loss: 0.7068 2024-06-01 08:40:26,594 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 6:07:42, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0417, loss_cls: 0.1791, acc: 93.2969, loss_bbox: 0.2314, loss_mask: 0.2308, loss: 0.7010 2024-06-01 08:40:59,085 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 6:07:09, time: 0.650, data_time: 0.029, memory: 13277, loss_rpn_cls: 
0.0179, loss_rpn_bbox: 0.0411, loss_cls: 0.1771, acc: 93.4209, loss_bbox: 0.2319, loss_mask: 0.2338, loss: 0.7019 2024-06-01 08:41:31,072 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 6:06:36, time: 0.640, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0422, loss_cls: 0.1765, acc: 93.4084, loss_bbox: 0.2326, loss_mask: 0.2349, loss: 0.7032 2024-06-01 08:42:03,212 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 6:06:02, time: 0.643, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0433, loss_cls: 0.1730, acc: 93.6331, loss_bbox: 0.2241, loss_mask: 0.2330, loss: 0.6925 2024-06-01 08:42:35,550 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 6:05:29, time: 0.647, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0403, loss_cls: 0.1725, acc: 93.5662, loss_bbox: 0.2260, loss_mask: 0.2279, loss: 0.6854 2024-06-01 08:43:07,931 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 6:04:56, time: 0.648, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0452, loss_cls: 0.1846, acc: 93.1553, loss_bbox: 0.2390, loss_mask: 0.2384, loss: 0.7269 2024-06-01 08:43:42,154 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 6:04:24, time: 0.684, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0395, loss_cls: 0.1704, acc: 93.6589, loss_bbox: 0.2210, loss_mask: 0.2306, loss: 0.6793 2024-06-01 08:44:14,210 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 6:03:51, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0392, loss_cls: 0.1680, acc: 93.7898, loss_bbox: 0.2199, loss_mask: 0.2268, loss: 0.6711 2024-06-01 08:44:48,359 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 6:03:19, time: 0.683, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0426, loss_cls: 0.1701, acc: 93.6582, loss_bbox: 0.2231, loss_mask: 0.2328, loss: 0.6863 2024-06-01 08:45:22,903 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 6:02:47, time: 0.691, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0390, loss_cls: 0.1663, acc: 93.8777, loss_bbox: 0.2219, loss_mask: 0.2365, loss: 0.6795 2024-06-01 08:45:57,113 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 6:02:15, time: 0.684, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0386, loss_cls: 0.1668, acc: 93.7803, loss_bbox: 0.2196, loss_mask: 0.2279, loss: 0.6688 2024-06-01 08:46:29,302 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 6:01:42, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0406, loss_cls: 0.1720, acc: 93.6753, loss_bbox: 0.2247, loss_mask: 0.2330, loss: 0.6893 2024-06-01 08:47:01,217 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 6:01:08, time: 0.638, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0416, loss_cls: 0.1702, acc: 93.7229, loss_bbox: 0.2236, loss_mask: 0.2368, loss: 0.6915 2024-06-01 08:47:33,433 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 6:00:35, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0410, loss_cls: 0.1779, acc: 93.2717, loss_bbox: 0.2317, loss_mask: 0.2331, loss: 0.7003 2024-06-01 08:48:11,450 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 6:00:05, time: 0.760, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0394, loss_cls: 0.1719, acc: 93.6162, loss_bbox: 0.2232, loss_mask: 0.2261, loss: 
0.6777 2024-06-01 08:48:43,672 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 5:59:32, time: 0.645, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0416, loss_cls: 0.1745, acc: 93.5273, loss_bbox: 0.2244, loss_mask: 0.2313, loss: 0.6909 2024-06-01 08:49:15,680 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 5:58:59, time: 0.640, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0402, loss_cls: 0.1703, acc: 93.6990, loss_bbox: 0.2230, loss_mask: 0.2275, loss: 0.6777 2024-06-01 08:49:50,119 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 5:58:27, time: 0.689, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0438, loss_cls: 0.1803, acc: 93.2039, loss_bbox: 0.2415, loss_mask: 0.2389, loss: 0.7241 2024-06-01 08:50:22,424 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 5:57:54, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0409, loss_cls: 0.1780, acc: 93.3816, loss_bbox: 0.2279, loss_mask: 0.2287, loss: 0.6931 2024-06-01 08:50:54,282 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 5:57:20, time: 0.637, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0385, loss_cls: 0.1662, acc: 93.8601, loss_bbox: 0.2207, loss_mask: 0.2341, loss: 0.6765 2024-06-01 08:51:29,026 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 5:56:49, time: 0.695, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0418, loss_cls: 0.1789, acc: 93.3101, loss_bbox: 0.2319, loss_mask: 0.2310, loss: 0.7012 2024-06-01 08:52:01,245 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 5:56:16, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0372, loss_cls: 0.1632, acc: 93.9202, loss_bbox: 0.2145, loss_mask: 0.2254, loss: 0.6577 2024-06-01 08:52:33,397 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 5:55:42, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0405, loss_cls: 0.1800, acc: 93.1938, loss_bbox: 0.2324, loss_mask: 0.2343, loss: 0.7045 2024-06-01 08:53:06,034 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 5:55:09, time: 0.653, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0429, loss_cls: 0.1808, acc: 93.2537, loss_bbox: 0.2307, loss_mask: 0.2310, loss: 0.7041 2024-06-01 08:53:38,474 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 5:54:36, time: 0.649, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0393, loss_cls: 0.1747, acc: 93.6189, loss_bbox: 0.2188, loss_mask: 0.2250, loss: 0.6764 2024-06-01 08:54:10,509 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 5:54:03, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0411, loss_cls: 0.1726, acc: 93.5400, loss_bbox: 0.2269, loss_mask: 0.2312, loss: 0.6898 2024-06-01 08:54:42,545 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 5:53:30, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0400, loss_cls: 0.1736, acc: 93.5408, loss_bbox: 0.2237, loss_mask: 0.2304, loss: 0.6854 2024-06-01 08:55:16,502 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 5:52:58, time: 0.679, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0384, loss_cls: 0.1691, acc: 93.7332, loss_bbox: 0.2209, loss_mask: 0.2289, loss: 0.6726 2024-06-01 08:55:48,769 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 5:52:24, time: 
0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0409, loss_cls: 0.1767, acc: 93.4968, loss_bbox: 0.2318, loss_mask: 0.2321, loss: 0.7000 2024-06-01 08:56:22,841 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 5:51:52, time: 0.682, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0368, loss_cls: 0.1665, acc: 93.8123, loss_bbox: 0.2186, loss_mask: 0.2292, loss: 0.6686 2024-06-01 08:56:56,924 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 5:51:20, time: 0.682, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0412, loss_cls: 0.1780, acc: 93.4131, loss_bbox: 0.2300, loss_mask: 0.2354, loss: 0.7032 2024-06-01 08:57:31,316 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 5:50:48, time: 0.688, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0394, loss_cls: 0.1696, acc: 93.7600, loss_bbox: 0.2219, loss_mask: 0.2259, loss: 0.6741 2024-06-01 08:58:03,340 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 5:50:15, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0408, loss_cls: 0.1742, acc: 93.6606, loss_bbox: 0.2263, loss_mask: 0.2318, loss: 0.6921 2024-06-01 08:58:35,277 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 5:49:42, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0387, loss_cls: 0.1693, acc: 93.7131, loss_bbox: 0.2169, loss_mask: 0.2342, loss: 0.6769 2024-06-01 08:59:07,761 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 5:49:09, time: 0.650, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0394, loss_cls: 0.1772, acc: 93.4666, loss_bbox: 0.2308, loss_mask: 0.2365, loss: 0.7012 2024-06-01 08:59:45,098 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 5:48:38, time: 0.747, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0410, loss_cls: 0.1765, acc: 93.3923, loss_bbox: 0.2333, loss_mask: 0.2363, loss: 0.7041 2024-06-01 09:00:16,955 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 5:48:05, time: 0.637, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0398, loss_cls: 0.1701, acc: 93.5698, loss_bbox: 0.2254, loss_mask: 0.2346, loss: 0.6879 2024-06-01 09:00:49,229 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 5:47:32, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0362, loss_cls: 0.1677, acc: 93.7251, loss_bbox: 0.2191, loss_mask: 0.2320, loss: 0.6722 2024-06-01 09:01:23,577 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 5:47:00, time: 0.687, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0399, loss_cls: 0.1722, acc: 93.4736, loss_bbox: 0.2321, loss_mask: 0.2301, loss: 0.6929 2024-06-01 09:01:55,720 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 5:46:26, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0392, loss_cls: 0.1663, acc: 93.8137, loss_bbox: 0.2185, loss_mask: 0.2293, loss: 0.6706 2024-06-01 09:02:27,727 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 5:45:53, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0420, loss_cls: 0.1755, acc: 93.4480, loss_bbox: 0.2288, loss_mask: 0.2290, loss: 0.6945 2024-06-01 09:03:01,756 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 5:45:21, time: 0.681, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0415, loss_cls: 0.1833, acc: 
93.3101, loss_bbox: 0.2366, loss_mask: 0.2346, loss: 0.7130 2024-06-01 09:03:34,071 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 5:44:48, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0405, loss_cls: 0.1672, acc: 93.7480, loss_bbox: 0.2247, loss_mask: 0.2250, loss: 0.6746 2024-06-01 09:04:06,140 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 5:44:15, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0404, loss_cls: 0.1699, acc: 93.6406, loss_bbox: 0.2202, loss_mask: 0.2299, loss: 0.6777 2024-06-01 09:04:38,083 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 5:43:41, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0417, loss_cls: 0.1718, acc: 93.5652, loss_bbox: 0.2246, loss_mask: 0.2258, loss: 0.6823 2024-06-01 09:05:10,237 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 5:43:08, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0412, loss_cls: 0.1740, acc: 93.5315, loss_bbox: 0.2253, loss_mask: 0.2360, loss: 0.6954 2024-06-01 09:05:42,238 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 5:42:35, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0396, loss_cls: 0.1786, acc: 93.3801, loss_bbox: 0.2329, loss_mask: 0.2357, loss: 0.7052 2024-06-01 09:06:14,411 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 5:42:02, time: 0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0395, loss_cls: 0.1697, acc: 93.7253, loss_bbox: 0.2227, loss_mask: 0.2252, loss: 0.6745 2024-06-01 09:06:48,271 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 5:41:29, time: 0.677, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0399, loss_cls: 0.1629, acc: 93.9622, loss_bbox: 0.2082, loss_mask: 0.2229, loss: 0.6512 2024-06-01 09:07:20,354 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 5:40:56, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0377, loss_cls: 0.1703, acc: 93.6396, loss_bbox: 0.2260, loss_mask: 0.2267, loss: 0.6776 2024-06-01 09:07:54,697 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 5:40:24, time: 0.687, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0432, loss_cls: 0.1754, acc: 93.5315, loss_bbox: 0.2286, loss_mask: 0.2377, loss: 0.7031 2024-06-01 09:08:28,694 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 5:39:52, time: 0.680, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0388, loss_cls: 0.1666, acc: 93.7012, loss_bbox: 0.2251, loss_mask: 0.2292, loss: 0.6748 2024-06-01 09:09:02,800 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 5:39:20, time: 0.682, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0394, loss_cls: 0.1707, acc: 93.6531, loss_bbox: 0.2211, loss_mask: 0.2276, loss: 0.6757 2024-06-01 09:09:34,884 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 5:38:46, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0391, loss_cls: 0.1703, acc: 93.6694, loss_bbox: 0.2197, loss_mask: 0.2298, loss: 0.6763 2024-06-01 09:10:06,967 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 5:38:13, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0394, loss_cls: 0.1716, acc: 93.7266, loss_bbox: 0.2239, loss_mask: 0.2324, loss: 0.6848 2024-06-01 09:10:38,874 - mmdet - INFO - Epoch 
[8][5800/7330] lr: 1.000e-04, eta: 5:37:40, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0397, loss_cls: 0.1758, acc: 93.4409, loss_bbox: 0.2278, loss_mask: 0.2338, loss: 0.6940 2024-06-01 09:11:15,921 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 5:37:09, time: 0.741, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0382, loss_cls: 0.1664, acc: 93.8069, loss_bbox: 0.2208, loss_mask: 0.2266, loss: 0.6689 2024-06-01 09:11:48,144 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 5:36:36, time: 0.644, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0419, loss_cls: 0.1755, acc: 93.3860, loss_bbox: 0.2334, loss_mask: 0.2337, loss: 0.7044 2024-06-01 09:12:20,033 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 5:36:03, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0374, loss_cls: 0.1627, acc: 93.9839, loss_bbox: 0.2112, loss_mask: 0.2271, loss: 0.6544 2024-06-01 09:12:54,899 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 5:35:31, time: 0.697, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0414, loss_cls: 0.1801, acc: 93.2771, loss_bbox: 0.2368, loss_mask: 0.2346, loss: 0.7123 2024-06-01 09:13:27,099 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 5:34:58, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0452, loss_cls: 0.1868, acc: 92.9768, loss_bbox: 0.2458, loss_mask: 0.2400, loss: 0.7378 2024-06-01 09:13:59,517 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 5:34:25, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0439, loss_cls: 0.1834, acc: 93.2334, loss_bbox: 0.2363, loss_mask: 0.2369, loss: 0.7217 2024-06-01 09:14:31,539 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 5:33:51, time: 0.640, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0395, loss_cls: 0.1692, acc: 93.6785, loss_bbox: 0.2170, loss_mask: 0.2263, loss: 0.6696 2024-06-01 09:15:06,085 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 5:33:20, time: 0.691, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0409, loss_cls: 0.1738, acc: 93.4734, loss_bbox: 0.2312, loss_mask: 0.2332, loss: 0.6980 2024-06-01 09:15:38,223 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 5:32:46, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0403, loss_cls: 0.1721, acc: 93.5891, loss_bbox: 0.2278, loss_mask: 0.2280, loss: 0.6871 2024-06-01 09:16:10,643 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 5:32:13, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0405, loss_cls: 0.1699, acc: 93.6270, loss_bbox: 0.2252, loss_mask: 0.2312, loss: 0.6851 2024-06-01 09:16:42,980 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 5:31:40, time: 0.647, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0408, loss_cls: 0.1744, acc: 93.5552, loss_bbox: 0.2263, loss_mask: 0.2305, loss: 0.6901 2024-06-01 09:17:15,246 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 5:31:07, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0384, loss_cls: 0.1750, acc: 93.3579, loss_bbox: 0.2280, loss_mask: 0.2237, loss: 0.6826 2024-06-01 09:17:47,215 - mmdet - INFO - Epoch [8][6450/7330] lr: 1.000e-04, eta: 5:30:34, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 
0.0166, loss_rpn_bbox: 0.0404, loss_cls: 0.1735, acc: 93.6255, loss_bbox: 0.2265, loss_mask: 0.2300, loss: 0.6870 2024-06-01 09:18:21,285 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 5:30:02, time: 0.681, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0365, loss_cls: 0.1645, acc: 93.9878, loss_bbox: 0.2148, loss_mask: 0.2247, loss: 0.6574 2024-06-01 09:18:53,479 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 5:29:28, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0396, loss_cls: 0.1679, acc: 93.6157, loss_bbox: 0.2238, loss_mask: 0.2330, loss: 0.6830 2024-06-01 09:19:27,738 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 5:28:56, time: 0.685, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0406, loss_cls: 0.1731, acc: 93.6257, loss_bbox: 0.2268, loss_mask: 0.2299, loss: 0.6875 2024-06-01 09:20:00,014 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 5:28:23, time: 0.645, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0396, loss_cls: 0.1764, acc: 93.4221, loss_bbox: 0.2328, loss_mask: 0.2331, loss: 0.6994 2024-06-01 09:20:36,936 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 5:27:52, time: 0.738, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0396, loss_cls: 0.1770, acc: 93.4968, loss_bbox: 0.2254, loss_mask: 0.2270, loss: 0.6862 2024-06-01 09:21:08,849 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 5:27:19, time: 0.638, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0409, loss_cls: 0.1754, acc: 93.5115, loss_bbox: 0.2247, loss_mask: 0.2307, loss: 0.6906 2024-06-01 09:21:40,756 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 5:26:46, time: 0.638, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0415, loss_cls: 0.1797, acc: 93.4568, loss_bbox: 0.2299, loss_mask: 0.2322, loss: 0.7016 2024-06-01 09:22:12,927 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 5:26:13, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0412, loss_cls: 0.1777, acc: 93.4290, loss_bbox: 0.2272, loss_mask: 0.2340, loss: 0.6985 2024-06-01 09:22:50,040 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 5:25:42, time: 0.742, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0408, loss_cls: 0.1794, acc: 93.3171, loss_bbox: 0.2338, loss_mask: 0.2336, loss: 0.7062 2024-06-01 09:23:22,140 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 5:25:09, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0424, loss_cls: 0.1803, acc: 93.3723, loss_bbox: 0.2312, loss_mask: 0.2352, loss: 0.7092 2024-06-01 09:23:54,293 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 5:24:35, time: 0.643, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0405, loss_cls: 0.1698, acc: 93.7432, loss_bbox: 0.2236, loss_mask: 0.2306, loss: 0.6811 2024-06-01 09:24:28,550 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 5:24:03, time: 0.685, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0391, loss_cls: 0.1704, acc: 93.7207, loss_bbox: 0.2202, loss_mask: 0.2273, loss: 0.6746 2024-06-01 09:25:00,441 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 5:23:30, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0388, loss_cls: 0.1754, acc: 93.5117, loss_bbox: 0.2281, loss_mask: 0.2318, loss: 
0.6913
2024-06-01 09:25:32,884 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 5:22:57, time: 0.649, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0414, loss_cls: 0.1768, acc: 93.4160, loss_bbox: 0.2295, loss_mask: 0.2317, loss: 0.6983
2024-06-01 09:26:04,802 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 5:22:24, time: 0.638, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0375, loss_cls: 0.1688, acc: 93.7471, loss_bbox: 0.2218, loss_mask: 0.2284, loss: 0.6730
2024-06-01 09:26:39,162 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 5:21:52, time: 0.687, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0390, loss_cls: 0.1760, acc: 93.4980, loss_bbox: 0.2263, loss_mask: 0.2304, loss: 0.6890
2024-06-01 09:27:11,023 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 5:21:18, time: 0.637, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0385, loss_cls: 0.1628, acc: 93.9521, loss_bbox: 0.2138, loss_mask: 0.2222, loss: 0.6531
2024-06-01 09:27:31,066 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-06-01 09:29:09,130 - mmdet - INFO - Evaluating bbox...
2024-06-01 09:29:31,582 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.462
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.698
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.506
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.280
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.504
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.631
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.377
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.626
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.750
2024-06-01 09:29:31,582 - mmdet - INFO - Evaluating segm...
2024-06-01 09:29:54,410 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.413
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.661
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.439
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.193
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.450
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.636
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.520
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.520
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.520
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.302
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.574
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.719
2024-06-01 09:29:54,730 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 09:29:54,732 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4620, bbox_mAP_50: 0.6980, bbox_mAP_75: 0.5060, bbox_mAP_s: 0.2800, bbox_mAP_m: 0.5040, bbox_mAP_l: 0.6310, bbox_mAP_copypaste: 0.462 0.698 0.506 0.280 0.504 0.631, segm_mAP: 0.4130, segm_mAP_50: 0.6610, segm_mAP_75: 0.4390, segm_mAP_s: 0.1930, segm_mAP_m: 0.4500, segm_mAP_l: 0.6360, segm_mAP_copypaste: 0.413 0.661 0.439 0.193 0.450 0.636
2024-06-01 09:30:43,084 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 5:20:24, time: 0.967, data_time: 0.096, memory: 13277, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0384, loss_cls: 0.1590, acc: 94.0447, loss_bbox: 0.2106, loss_mask: 0.2215, loss: 0.6447
2024-06-01 09:31:15,163 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 5:19:50, time: 0.642, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0371, loss_cls: 0.1619, acc: 93.9221, loss_bbox: 0.2174, loss_mask: 0.2213, loss: 0.6547
2024-06-01 09:31:47,062 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 5:19:17, time: 0.638, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0359, loss_cls: 0.1579, acc: 94.0994, loss_bbox: 0.2065, loss_mask: 0.2180, loss: 0.6337
2024-06-01 09:32:19,276 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 5:18:44, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0398, loss_cls: 0.1638, acc: 93.8784, loss_bbox: 0.2219, loss_mask: 0.2241, loss: 0.6658
2024-06-01 09:32:51,510 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 5:18:11, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0383, loss_cls: 0.1597, acc: 93.9290, loss_bbox: 0.2183, loss_mask: 0.2253, loss: 0.6575
2024-06-01 09:33:23,737 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 5:17:38, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0390, loss_cls: 0.1630, acc: 93.8064, loss_bbox: 0.2206, loss_mask: 0.2261, loss: 0.6638
2024-06-01 09:33:57,897 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 5:17:05, time: 0.683, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0414, loss_cls: 0.1647, acc: 93.7671, loss_bbox: 0.2125, loss_mask: 0.2209, loss: 0.6554
2024-06-01 09:34:30,278 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 5:16:32, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0399, loss_cls: 0.1715, acc: 93.5583, loss_bbox: 0.2223, loss_mask:
0.2257, loss: 0.6763 2024-06-01 09:35:02,144 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 5:15:59, time: 0.637, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0357, loss_cls: 0.1556, acc: 94.0435, loss_bbox: 0.2072, loss_mask: 0.2211, loss: 0.6339 2024-06-01 09:35:34,302 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 5:15:26, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0394, loss_cls: 0.1674, acc: 93.6008, loss_bbox: 0.2231, loss_mask: 0.2265, loss: 0.6725 2024-06-01 09:36:06,531 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 5:14:53, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0348, loss_cls: 0.1514, acc: 94.2764, loss_bbox: 0.2071, loss_mask: 0.2207, loss: 0.6286 2024-06-01 09:36:40,753 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 5:14:21, time: 0.684, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0376, loss_cls: 0.1626, acc: 93.8823, loss_bbox: 0.2144, loss_mask: 0.2284, loss: 0.6586 2024-06-01 09:37:12,691 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 5:13:47, time: 0.639, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0377, loss_cls: 0.1624, acc: 93.8799, loss_bbox: 0.2177, loss_mask: 0.2275, loss: 0.6608 2024-06-01 09:37:44,989 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 5:13:14, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0382, loss_cls: 0.1599, acc: 93.8706, loss_bbox: 0.2164, loss_mask: 0.2203, loss: 0.6510 2024-06-01 09:38:19,667 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 5:12:42, time: 0.694, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0392, loss_cls: 0.1606, acc: 93.8911, loss_bbox: 0.2115, loss_mask: 0.2221, loss: 0.6502 2024-06-01 09:38:51,657 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 5:12:09, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0365, loss_cls: 0.1556, acc: 94.0986, loss_bbox: 0.2114, loss_mask: 0.2188, loss: 0.6371 2024-06-01 09:39:26,215 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 5:11:37, time: 0.691, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0375, loss_cls: 0.1541, acc: 94.0376, loss_bbox: 0.2097, loss_mask: 0.2164, loss: 0.6329 2024-06-01 09:39:58,344 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 5:11:04, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0398, loss_cls: 0.1601, acc: 93.8779, loss_bbox: 0.2187, loss_mask: 0.2214, loss: 0.6552 2024-06-01 09:40:30,469 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 5:10:31, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0391, loss_cls: 0.1574, acc: 93.9409, loss_bbox: 0.2113, loss_mask: 0.2200, loss: 0.6436 2024-06-01 09:41:02,117 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 5:09:57, time: 0.633, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0359, loss_cls: 0.1608, acc: 93.8550, loss_bbox: 0.2143, loss_mask: 0.2226, loss: 0.6479 2024-06-01 09:41:34,376 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 5:09:24, time: 0.645, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0397, loss_cls: 0.1648, acc: 93.7219, loss_bbox: 0.2200, loss_mask: 0.2264, loss: 0.6675 2024-06-01 09:42:06,600 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 5:08:51, time: 
0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0375, loss_cls: 0.1609, acc: 93.9324, loss_bbox: 0.2128, loss_mask: 0.2252, loss: 0.6507 2024-06-01 09:42:38,921 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 5:08:18, time: 0.647, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0386, loss_cls: 0.1594, acc: 93.9092, loss_bbox: 0.2142, loss_mask: 0.2195, loss: 0.6462 2024-06-01 09:43:14,470 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 5:07:46, time: 0.711, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0391, loss_cls: 0.1607, acc: 93.8230, loss_bbox: 0.2134, loss_mask: 0.2246, loss: 0.6524 2024-06-01 09:43:46,737 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 5:07:13, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0384, loss_cls: 0.1599, acc: 93.9031, loss_bbox: 0.2140, loss_mask: 0.2212, loss: 0.6483 2024-06-01 09:44:19,155 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 5:06:40, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0380, loss_cls: 0.1599, acc: 94.0129, loss_bbox: 0.2166, loss_mask: 0.2201, loss: 0.6488 2024-06-01 09:44:56,364 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 5:06:10, time: 0.744, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0370, loss_cls: 0.1521, acc: 94.2698, loss_bbox: 0.2069, loss_mask: 0.2169, loss: 0.6265 2024-06-01 09:45:33,202 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 5:05:39, time: 0.737, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0362, loss_cls: 0.1585, acc: 93.9399, loss_bbox: 0.2108, loss_mask: 0.2224, loss: 0.6425 2024-06-01 09:46:05,396 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 5:05:05, time: 0.644, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0364, loss_cls: 0.1546, acc: 94.0916, loss_bbox: 0.2065, loss_mask: 0.2127, loss: 0.6247 2024-06-01 09:46:37,637 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 5:04:32, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0364, loss_cls: 0.1582, acc: 94.0591, loss_bbox: 0.2095, loss_mask: 0.2260, loss: 0.6440 2024-06-01 09:47:09,629 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 5:03:59, time: 0.640, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0362, loss_cls: 0.1524, acc: 94.1089, loss_bbox: 0.2061, loss_mask: 0.2158, loss: 0.6241 2024-06-01 09:47:41,410 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 5:03:26, time: 0.636, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0366, loss_cls: 0.1513, acc: 94.1599, loss_bbox: 0.2056, loss_mask: 0.2174, loss: 0.6254 2024-06-01 09:48:16,444 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 5:02:54, time: 0.701, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0358, loss_cls: 0.1516, acc: 94.1602, loss_bbox: 0.2075, loss_mask: 0.2209, loss: 0.6289 2024-06-01 09:48:48,687 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 5:02:21, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0356, loss_cls: 0.1538, acc: 94.2456, loss_bbox: 0.2057, loss_mask: 0.2148, loss: 0.6245 2024-06-01 09:49:23,065 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 5:01:49, time: 0.687, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0374, loss_cls: 0.1616, acc: 
93.9021, loss_bbox: 0.2152, loss_mask: 0.2210, loss: 0.6504 2024-06-01 09:49:55,315 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 5:01:16, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0383, loss_cls: 0.1602, acc: 93.8191, loss_bbox: 0.2138, loss_mask: 0.2205, loss: 0.6476 2024-06-01 09:50:27,777 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 5:00:42, time: 0.649, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0402, loss_cls: 0.1679, acc: 93.5876, loss_bbox: 0.2206, loss_mask: 0.2222, loss: 0.6651 2024-06-01 09:51:02,264 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 5:00:10, time: 0.690, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0389, loss_cls: 0.1603, acc: 93.9221, loss_bbox: 0.2152, loss_mask: 0.2231, loss: 0.6531 2024-06-01 09:51:34,312 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 4:59:37, time: 0.641, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0351, loss_cls: 0.1555, acc: 94.1785, loss_bbox: 0.2068, loss_mask: 0.2151, loss: 0.6255 2024-06-01 09:52:06,448 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 4:59:04, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0354, loss_cls: 0.1499, acc: 94.2571, loss_bbox: 0.2046, loss_mask: 0.2141, loss: 0.6169 2024-06-01 09:52:38,485 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 4:58:31, time: 0.641, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0373, loss_cls: 0.1558, acc: 94.1167, loss_bbox: 0.2101, loss_mask: 0.2236, loss: 0.6422 2024-06-01 09:53:10,622 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 4:57:58, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0368, loss_cls: 0.1527, acc: 94.1670, loss_bbox: 0.2043, loss_mask: 0.2196, loss: 0.6289 2024-06-01 09:53:42,752 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 4:57:25, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0364, loss_cls: 0.1509, acc: 94.2676, loss_bbox: 0.2024, loss_mask: 0.2162, loss: 0.6195 2024-06-01 09:54:14,846 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 4:56:51, time: 0.642, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0365, loss_cls: 0.1464, acc: 94.3447, loss_bbox: 0.2035, loss_mask: 0.2181, loss: 0.6188 2024-06-01 09:54:49,790 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 4:56:19, time: 0.699, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0356, loss_cls: 0.1524, acc: 94.1716, loss_bbox: 0.2108, loss_mask: 0.2194, loss: 0.6328 2024-06-01 09:55:22,106 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 4:55:46, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0369, loss_cls: 0.1567, acc: 93.9517, loss_bbox: 0.2097, loss_mask: 0.2211, loss: 0.6391 2024-06-01 09:55:54,644 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 4:55:13, time: 0.651, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0396, loss_cls: 0.1638, acc: 93.7295, loss_bbox: 0.2225, loss_mask: 0.2231, loss: 0.6647 2024-06-01 09:56:31,575 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 4:54:42, time: 0.739, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0375, loss_cls: 0.1579, acc: 93.9734, loss_bbox: 0.2124, loss_mask: 0.2228, loss: 0.6453 2024-06-01 09:57:08,146 - mmdet - INFO - Epoch 
[9][2450/7330] lr: 1.000e-05, eta: 4:54:11, time: 0.731, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0359, loss_cls: 0.1495, acc: 94.3271, loss_bbox: 0.2059, loss_mask: 0.2179, loss: 0.6227 2024-06-01 09:57:40,644 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 4:53:38, time: 0.650, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0378, loss_cls: 0.1566, acc: 94.0352, loss_bbox: 0.2076, loss_mask: 0.2199, loss: 0.6363 2024-06-01 09:58:12,780 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 4:53:05, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0355, loss_cls: 0.1522, acc: 94.2729, loss_bbox: 0.2045, loss_mask: 0.2202, loss: 0.6264 2024-06-01 09:58:44,760 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 4:52:32, time: 0.640, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0380, loss_cls: 0.1596, acc: 93.8313, loss_bbox: 0.2144, loss_mask: 0.2243, loss: 0.6511 2024-06-01 09:59:17,077 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 4:51:59, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0376, loss_cls: 0.1549, acc: 94.0681, loss_bbox: 0.2138, loss_mask: 0.2171, loss: 0.6388 2024-06-01 09:59:52,471 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 4:51:27, time: 0.708, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0358, loss_cls: 0.1611, acc: 93.8748, loss_bbox: 0.2152, loss_mask: 0.2168, loss: 0.6439 2024-06-01 10:00:24,628 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 4:50:54, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0355, loss_cls: 0.1515, acc: 94.3000, loss_bbox: 0.2022, loss_mask: 0.2141, loss: 0.6178 2024-06-01 10:00:56,818 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 4:50:21, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0372, loss_cls: 0.1628, acc: 93.7427, loss_bbox: 0.2144, loss_mask: 0.2232, loss: 0.6534 2024-06-01 10:01:31,247 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 4:49:49, time: 0.689, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0380, loss_cls: 0.1603, acc: 93.8489, loss_bbox: 0.2152, loss_mask: 0.2217, loss: 0.6491 2024-06-01 10:02:03,344 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 4:49:15, time: 0.642, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0377, loss_cls: 0.1549, acc: 94.1736, loss_bbox: 0.2115, loss_mask: 0.2236, loss: 0.6412 2024-06-01 10:02:38,026 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 4:48:43, time: 0.694, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0375, loss_cls: 0.1594, acc: 93.9465, loss_bbox: 0.2107, loss_mask: 0.2206, loss: 0.6439 2024-06-01 10:03:10,147 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 4:48:10, time: 0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0394, loss_cls: 0.1631, acc: 93.7644, loss_bbox: 0.2184, loss_mask: 0.2246, loss: 0.6609 2024-06-01 10:03:42,475 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 4:47:37, time: 0.647, data_time: 0.045, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0371, loss_cls: 0.1561, acc: 94.1160, loss_bbox: 0.2095, loss_mask: 0.2179, loss: 0.6341 2024-06-01 10:04:14,286 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 4:47:04, time: 0.636, data_time: 0.026, memory: 13277, loss_rpn_cls: 
0.0133, loss_rpn_bbox: 0.0349, loss_cls: 0.1440, acc: 94.5122, loss_bbox: 0.1921, loss_mask: 0.2077, loss: 0.5920 2024-06-01 10:04:46,733 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 4:46:31, time: 0.649, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0395, loss_cls: 0.1599, acc: 93.9202, loss_bbox: 0.2125, loss_mask: 0.2192, loss: 0.6460 2024-06-01 10:05:18,902 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 4:45:58, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0389, loss_cls: 0.1569, acc: 93.9795, loss_bbox: 0.2119, loss_mask: 0.2171, loss: 0.6399 2024-06-01 10:05:51,076 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 4:45:25, time: 0.644, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0373, loss_cls: 0.1537, acc: 94.0886, loss_bbox: 0.2095, loss_mask: 0.2144, loss: 0.6291 2024-06-01 10:06:26,318 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 4:44:53, time: 0.705, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0358, loss_cls: 0.1518, acc: 94.2166, loss_bbox: 0.2089, loss_mask: 0.2163, loss: 0.6259 2024-06-01 10:06:58,694 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 4:44:20, time: 0.647, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0363, loss_cls: 0.1515, acc: 94.2083, loss_bbox: 0.2055, loss_mask: 0.2153, loss: 0.6217 2024-06-01 10:07:31,374 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 4:43:47, time: 0.654, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0392, loss_cls: 0.1613, acc: 93.8975, loss_bbox: 0.2188, loss_mask: 0.2231, loss: 0.6575 2024-06-01 10:08:08,791 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 4:43:16, time: 0.748, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0377, loss_cls: 0.1512, acc: 94.1685, loss_bbox: 0.2058, loss_mask: 0.2180, loss: 0.6268 2024-06-01 10:08:45,723 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 4:42:45, time: 0.739, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0350, loss_cls: 0.1581, acc: 93.9814, loss_bbox: 0.2144, loss_mask: 0.2195, loss: 0.6415 2024-06-01 10:09:17,704 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 4:42:11, time: 0.640, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0358, loss_cls: 0.1479, acc: 94.4282, loss_bbox: 0.1960, loss_mask: 0.2116, loss: 0.6044 2024-06-01 10:09:49,900 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 4:41:38, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0375, loss_cls: 0.1641, acc: 93.7910, loss_bbox: 0.2186, loss_mask: 0.2238, loss: 0.6588 2024-06-01 10:10:22,137 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 4:41:05, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0368, loss_cls: 0.1542, acc: 94.1753, loss_bbox: 0.2075, loss_mask: 0.2172, loss: 0.6303 2024-06-01 10:10:54,430 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 4:40:32, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0381, loss_cls: 0.1527, acc: 94.1489, loss_bbox: 0.2124, loss_mask: 0.2207, loss: 0.6385 2024-06-01 10:11:30,466 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 4:40:01, time: 0.720, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0356, loss_cls: 0.1535, acc: 94.1409, loss_bbox: 0.2078, loss_mask: 0.2180, loss: 
0.6294 2024-06-01 10:12:02,815 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 4:39:27, time: 0.647, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0382, loss_cls: 0.1610, acc: 93.8882, loss_bbox: 0.2133, loss_mask: 0.2231, loss: 0.6512 2024-06-01 10:12:35,270 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 4:38:54, time: 0.649, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0388, loss_cls: 0.1607, acc: 93.9050, loss_bbox: 0.2155, loss_mask: 0.2179, loss: 0.6486 2024-06-01 10:13:09,732 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 4:38:22, time: 0.689, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0416, loss_cls: 0.1647, acc: 93.7651, loss_bbox: 0.2168, loss_mask: 0.2201, loss: 0.6597 2024-06-01 10:13:41,967 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 4:37:49, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0397, loss_cls: 0.1582, acc: 94.0276, loss_bbox: 0.2121, loss_mask: 0.2214, loss: 0.6458 2024-06-01 10:14:16,951 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 4:37:17, time: 0.700, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0377, loss_cls: 0.1510, acc: 94.2681, loss_bbox: 0.2116, loss_mask: 0.2230, loss: 0.6380 2024-06-01 10:14:49,247 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 4:36:44, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0345, loss_cls: 0.1487, acc: 94.2432, loss_bbox: 0.2032, loss_mask: 0.2189, loss: 0.6185 2024-06-01 10:15:21,498 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 4:36:11, time: 0.645, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0359, loss_cls: 0.1525, acc: 94.2612, loss_bbox: 0.2037, loss_mask: 0.2123, loss: 0.6182 2024-06-01 10:15:53,745 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 4:35:38, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0377, loss_cls: 0.1584, acc: 93.9531, loss_bbox: 0.2135, loss_mask: 0.2187, loss: 0.6437 2024-06-01 10:16:25,863 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 4:35:05, time: 0.642, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0386, loss_cls: 0.1582, acc: 93.9446, loss_bbox: 0.2123, loss_mask: 0.2245, loss: 0.6477 2024-06-01 10:16:58,015 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 4:34:32, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0360, loss_cls: 0.1534, acc: 94.1550, loss_bbox: 0.2091, loss_mask: 0.2210, loss: 0.6326 2024-06-01 10:17:30,010 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 4:33:58, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0370, loss_cls: 0.1540, acc: 94.1121, loss_bbox: 0.2043, loss_mask: 0.2186, loss: 0.6277 2024-06-01 10:18:05,401 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 4:33:27, time: 0.708, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0378, loss_cls: 0.1581, acc: 93.9956, loss_bbox: 0.2126, loss_mask: 0.2209, loss: 0.6428 2024-06-01 10:18:37,564 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 4:32:53, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0383, loss_cls: 0.1581, acc: 93.8931, loss_bbox: 0.2147, loss_mask: 0.2239, loss: 0.6499 2024-06-01 10:19:09,651 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 4:32:20, time: 
0.642, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0362, loss_cls: 0.1538, acc: 94.0510, loss_bbox: 0.2090, loss_mask: 0.2170, loss: 0.6290 2024-06-01 10:19:46,518 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 4:31:49, time: 0.737, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0378, loss_cls: 0.1594, acc: 93.8940, loss_bbox: 0.2127, loss_mask: 0.2155, loss: 0.6389 2024-06-01 10:20:22,624 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 4:31:17, time: 0.722, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0336, loss_cls: 0.1488, acc: 94.4290, loss_bbox: 0.2019, loss_mask: 0.2153, loss: 0.6126 2024-06-01 10:20:54,597 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 4:30:44, time: 0.639, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0341, loss_cls: 0.1456, acc: 94.4565, loss_bbox: 0.1976, loss_mask: 0.2130, loss: 0.6040 2024-06-01 10:21:26,919 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 4:30:11, time: 0.646, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0390, loss_cls: 0.1558, acc: 94.1033, loss_bbox: 0.2107, loss_mask: 0.2224, loss: 0.6434 2024-06-01 10:21:59,126 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 4:29:38, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0360, loss_cls: 0.1472, acc: 94.4558, loss_bbox: 0.2029, loss_mask: 0.2152, loss: 0.6149 2024-06-01 10:22:31,006 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 4:29:05, time: 0.638, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0360, loss_cls: 0.1476, acc: 94.4204, loss_bbox: 0.1991, loss_mask: 0.2162, loss: 0.6127 2024-06-01 10:23:06,173 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 4:28:33, time: 0.703, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0374, loss_cls: 0.1500, acc: 94.2056, loss_bbox: 0.2086, loss_mask: 0.2171, loss: 0.6275 2024-06-01 10:23:38,173 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 4:28:00, time: 0.640, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0369, loss_cls: 0.1483, acc: 94.3306, loss_bbox: 0.2026, loss_mask: 0.2166, loss: 0.6188 2024-06-01 10:24:10,587 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 4:27:27, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0400, loss_cls: 0.1584, acc: 93.9541, loss_bbox: 0.2186, loss_mask: 0.2245, loss: 0.6566 2024-06-01 10:24:44,631 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 4:26:54, time: 0.681, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0363, loss_cls: 0.1524, acc: 94.1611, loss_bbox: 0.2065, loss_mask: 0.2148, loss: 0.6235 2024-06-01 10:25:16,912 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 4:26:21, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0362, loss_cls: 0.1540, acc: 94.1367, loss_bbox: 0.2065, loss_mask: 0.2192, loss: 0.6304 2024-06-01 10:25:51,145 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 4:25:49, time: 0.685, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0359, loss_cls: 0.1516, acc: 94.1667, loss_bbox: 0.2034, loss_mask: 0.2208, loss: 0.6261 2024-06-01 10:26:23,204 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 4:25:16, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0343, loss_cls: 0.1530, acc: 
94.1606, loss_bbox: 0.2046, loss_mask: 0.2125, loss: 0.6166 2024-06-01 10:26:55,003 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 4:24:42, time: 0.636, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0354, loss_cls: 0.1560, acc: 94.0156, loss_bbox: 0.2118, loss_mask: 0.2238, loss: 0.6404 2024-06-01 10:27:27,471 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 4:24:09, time: 0.649, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0406, loss_cls: 0.1684, acc: 93.5659, loss_bbox: 0.2278, loss_mask: 0.2297, loss: 0.6829 2024-06-01 10:27:59,992 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 4:23:36, time: 0.650, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0355, loss_cls: 0.1558, acc: 94.0435, loss_bbox: 0.2077, loss_mask: 0.2189, loss: 0.6320 2024-06-01 10:28:32,387 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 4:23:03, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0355, loss_cls: 0.1494, acc: 94.3191, loss_bbox: 0.2047, loss_mask: 0.2205, loss: 0.6235 2024-06-01 10:29:04,625 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 4:22:30, time: 0.645, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0351, loss_cls: 0.1498, acc: 94.3328, loss_bbox: 0.2002, loss_mask: 0.2125, loss: 0.6107 2024-06-01 10:29:39,502 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 4:21:58, time: 0.698, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0379, loss_cls: 0.1536, acc: 94.1719, loss_bbox: 0.2021, loss_mask: 0.2109, loss: 0.6192 2024-06-01 10:30:12,027 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 4:21:25, time: 0.650, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0404, loss_cls: 0.1594, acc: 93.8923, loss_bbox: 0.2129, loss_mask: 0.2166, loss: 0.6448 2024-06-01 10:30:44,136 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 4:20:52, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0364, loss_cls: 0.1509, acc: 94.2319, loss_bbox: 0.2029, loss_mask: 0.2194, loss: 0.6240 2024-06-01 10:31:20,894 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 4:20:21, time: 0.735, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0382, loss_cls: 0.1523, acc: 94.2610, loss_bbox: 0.2082, loss_mask: 0.2195, loss: 0.6321 2024-06-01 10:31:57,473 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 4:19:49, time: 0.732, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0390, loss_cls: 0.1610, acc: 93.9048, loss_bbox: 0.2174, loss_mask: 0.2224, loss: 0.6553 2024-06-01 10:32:29,672 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 4:19:16, time: 0.644, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0351, loss_cls: 0.1527, acc: 94.1348, loss_bbox: 0.2082, loss_mask: 0.2168, loss: 0.6257 2024-06-01 10:33:01,812 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 4:18:43, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0371, loss_cls: 0.1484, acc: 94.2568, loss_bbox: 0.2007, loss_mask: 0.2154, loss: 0.6158 2024-06-01 10:33:33,968 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 4:18:10, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0366, loss_cls: 0.1536, acc: 94.1301, loss_bbox: 0.2078, loss_mask: 0.2194, loss: 0.6321 2024-06-01 10:34:05,972 - mmdet - INFO - Epoch 
[9][5800/7330] lr: 1.000e-05, eta: 4:17:37, time: 0.640, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0358, loss_cls: 0.1484, acc: 94.3440, loss_bbox: 0.2009, loss_mask: 0.2152, loss: 0.6129 2024-06-01 10:34:41,195 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 4:17:05, time: 0.705, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0357, loss_cls: 0.1556, acc: 94.1443, loss_bbox: 0.2069, loss_mask: 0.2174, loss: 0.6296 2024-06-01 10:35:13,544 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 4:16:32, time: 0.647, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0351, loss_cls: 0.1523, acc: 94.2759, loss_bbox: 0.1997, loss_mask: 0.2113, loss: 0.6124 2024-06-01 10:35:45,947 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 4:15:59, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0360, loss_cls: 0.1508, acc: 94.2791, loss_bbox: 0.2060, loss_mask: 0.2172, loss: 0.6234 2024-06-01 10:36:20,365 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 4:15:26, time: 0.688, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0366, loss_cls: 0.1544, acc: 94.0437, loss_bbox: 0.2102, loss_mask: 0.2153, loss: 0.6294 2024-06-01 10:36:52,631 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 4:14:53, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0370, loss_cls: 0.1499, acc: 94.2947, loss_bbox: 0.2073, loss_mask: 0.2156, loss: 0.6227 2024-06-01 10:37:27,146 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 4:14:21, time: 0.690, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0368, loss_cls: 0.1517, acc: 94.1934, loss_bbox: 0.2085, loss_mask: 0.2202, loss: 0.6314 2024-06-01 10:37:59,229 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 4:13:48, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0360, loss_cls: 0.1521, acc: 94.2012, loss_bbox: 0.2051, loss_mask: 0.2161, loss: 0.6238 2024-06-01 10:38:31,417 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 4:13:15, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0384, loss_cls: 0.1595, acc: 93.9688, loss_bbox: 0.2112, loss_mask: 0.2261, loss: 0.6504 2024-06-01 10:39:03,796 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 4:12:42, time: 0.648, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0350, loss_cls: 0.1461, acc: 94.3215, loss_bbox: 0.2011, loss_mask: 0.2152, loss: 0.6109 2024-06-01 10:39:35,939 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 4:12:09, time: 0.643, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0391, loss_cls: 0.1532, acc: 94.0715, loss_bbox: 0.2106, loss_mask: 0.2209, loss: 0.6386 2024-06-01 10:40:08,074 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 4:11:35, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0362, loss_cls: 0.1549, acc: 94.0901, loss_bbox: 0.2048, loss_mask: 0.2113, loss: 0.6214 2024-06-01 10:40:40,370 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 4:11:02, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0366, loss_cls: 0.1516, acc: 94.1421, loss_bbox: 0.2104, loss_mask: 0.2238, loss: 0.6358 2024-06-01 10:41:15,040 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 4:10:30, time: 0.693, data_time: 0.035, memory: 13277, loss_rpn_cls: 
0.0133, loss_rpn_bbox: 0.0363, loss_cls: 0.1511, acc: 94.2561, loss_bbox: 0.2087, loss_mask: 0.2171, loss: 0.6266 2024-06-01 10:41:47,395 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 4:09:57, time: 0.647, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0358, loss_cls: 0.1496, acc: 94.3037, loss_bbox: 0.2077, loss_mask: 0.2113, loss: 0.6183 2024-06-01 10:42:19,628 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 4:09:24, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0361, loss_cls: 0.1511, acc: 94.1362, loss_bbox: 0.2095, loss_mask: 0.2197, loss: 0.6289 2024-06-01 10:42:58,020 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 4:08:53, time: 0.768, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0371, loss_cls: 0.1526, acc: 94.1001, loss_bbox: 0.2102, loss_mask: 0.2175, loss: 0.6306 2024-06-01 10:43:34,616 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 4:08:22, time: 0.732, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0379, loss_cls: 0.1542, acc: 94.1687, loss_bbox: 0.2050, loss_mask: 0.2136, loss: 0.6250 2024-06-01 10:44:06,581 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 4:07:48, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0375, loss_cls: 0.1518, acc: 94.2563, loss_bbox: 0.2051, loss_mask: 0.2196, loss: 0.6288 2024-06-01 10:44:38,873 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 4:07:15, time: 0.645, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0383, loss_cls: 0.1531, acc: 94.2009, loss_bbox: 0.2057, loss_mask: 0.2146, loss: 0.6262 2024-06-01 10:45:10,983 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 4:06:42, time: 0.643, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0361, loss_cls: 0.1540, acc: 94.1472, loss_bbox: 0.2125, loss_mask: 0.2221, loss: 0.6377 2024-06-01 10:45:43,464 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 4:06:09, time: 0.650, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0359, loss_cls: 0.1513, acc: 94.2485, loss_bbox: 0.2078, loss_mask: 0.2195, loss: 0.6296 2024-06-01 10:46:18,628 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 4:05:37, time: 0.703, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0391, loss_cls: 0.1616, acc: 93.8042, loss_bbox: 0.2177, loss_mask: 0.2229, loss: 0.6566 2024-06-01 10:46:50,801 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 4:05:04, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0350, loss_cls: 0.1496, acc: 94.3481, loss_bbox: 0.2012, loss_mask: 0.2159, loss: 0.6162 2024-06-01 10:47:23,020 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 4:04:31, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0356, loss_cls: 0.1534, acc: 94.1465, loss_bbox: 0.2085, loss_mask: 0.2200, loss: 0.6310 2024-06-01 10:47:57,485 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 4:03:59, time: 0.689, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0362, loss_cls: 0.1520, acc: 94.1995, loss_bbox: 0.2031, loss_mask: 0.2158, loss: 0.6206 2024-06-01 10:48:29,477 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 4:03:25, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0329, loss_cls: 0.1490, acc: 94.3157, loss_bbox: 0.1974, loss_mask: 0.2105, loss: 
0.6020
2024-06-01 10:49:04,053 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 4:02:53, time: 0.691, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0367, loss_cls: 0.1549, acc: 94.0203, loss_bbox: 0.2090, loss_mask: 0.2185, loss: 0.6325
2024-06-01 10:49:36,517 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 4:02:20, time: 0.649, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0367, loss_cls: 0.1533, acc: 94.1021, loss_bbox: 0.2072, loss_mask: 0.2130, loss: 0.6249
2024-06-01 10:50:08,801 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 4:01:47, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0361, loss_cls: 0.1477, acc: 94.3618, loss_bbox: 0.2015, loss_mask: 0.2152, loss: 0.6139
2024-06-01 10:50:41,115 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 4:01:14, time: 0.646, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0379, loss_cls: 0.1550, acc: 94.0879, loss_bbox: 0.2115, loss_mask: 0.2178, loss: 0.6356
2024-06-01 10:51:01,052 - mmdet - INFO - Saving checkpoint at 9 epochs
2024-06-01 10:52:39,528 - mmdet - INFO - Evaluating bbox...
2024-06-01 10:52:58,437 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.479
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.708
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.520
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.291
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.523
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.654
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.588
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.588
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.588
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.388
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.635
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.762
2024-06-01 10:52:58,437 - mmdet - INFO - Evaluating segm...
2024-06-01 10:53:19,860 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.672
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.450
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.203
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.460
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.646
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.526
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.526
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.526
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.310
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.576
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.720
2024-06-01 10:53:20,147 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 10:53:20,148 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4790, bbox_mAP_50: 0.7080, bbox_mAP_75: 0.5200, bbox_mAP_s: 0.2910, bbox_mAP_m: 0.5230, bbox_mAP_l: 0.6540, bbox_mAP_copypaste: 0.479 0.708 0.520 0.291 0.523 0.654, segm_mAP: 0.4240, segm_mAP_50: 0.6720, segm_mAP_75: 0.4500, segm_mAP_s: 0.2030, segm_mAP_m: 0.4600, segm_mAP_l: 0.6460, segm_mAP_copypaste: 0.424 0.672 0.450 0.203 0.460 0.646
2024-06-01 10:54:06,635 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 4:00:19, time: 0.929, data_time: 0.099, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0361, loss_cls: 0.1457, acc: 94.4243, loss_bbox: 0.1992, loss_mask: 0.2133, loss: 0.6071
2024-06-01 10:54:38,795 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 3:59:46, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0348, loss_cls: 0.1514, acc: 94.2190, loss_bbox: 0.2048, loss_mask: 0.2148, loss: 0.6189
2024-06-01 10:55:10,922 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 3:59:13, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0366, loss_cls: 0.1525, acc: 94.1780, loss_bbox: 0.2078, loss_mask: 0.2181, loss: 0.6286
2024-06-01 10:55:43,409 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 3:58:40, time: 0.650, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0374, loss_cls: 0.1594, acc: 93.8760, loss_bbox: 0.2181, loss_mask: 0.2239, loss: 0.6536
2024-06-01 10:56:15,464 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 3:58:07, time: 0.641, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0373, loss_cls: 0.1514, acc: 94.2615, loss_bbox: 0.2059, loss_mask: 0.2152, loss: 0.6244
2024-06-01 10:56:48,321 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 3:57:34, time: 0.657, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0386, loss_cls: 0.1571, acc: 93.9800, loss_bbox: 0.2175, loss_mask: 0.2249, loss: 0.6523
2024-06-01 10:57:20,758 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 3:57:01, time: 0.649, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0365, loss_cls: 0.1428, acc: 94.5569, loss_bbox: 0.1948, loss_mask: 0.2108, loss: 0.5978
2024-06-01 10:57:55,366 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 3:56:29, time: 0.692, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0383, loss_cls: 0.1561, acc: 94.0251, loss_bbox: 0.2099,
loss_mask: 0.2169, loss: 0.6350 2024-06-01 10:58:27,533 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 3:55:56, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0370, loss_cls: 0.1526, acc: 94.1865, loss_bbox: 0.2082, loss_mask: 0.2187, loss: 0.6310 2024-06-01 10:59:00,027 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 3:55:23, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0389, loss_cls: 0.1579, acc: 93.9719, loss_bbox: 0.2153, loss_mask: 0.2210, loss: 0.6479 2024-06-01 10:59:32,164 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 3:54:50, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0372, loss_cls: 0.1524, acc: 94.1985, loss_bbox: 0.2127, loss_mask: 0.2182, loss: 0.6344 2024-06-01 11:00:04,705 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 3:54:17, time: 0.651, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0367, loss_cls: 0.1488, acc: 94.3159, loss_bbox: 0.2024, loss_mask: 0.2153, loss: 0.6165 2024-06-01 11:00:36,789 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 3:53:44, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0364, loss_cls: 0.1462, acc: 94.3472, loss_bbox: 0.2028, loss_mask: 0.2124, loss: 0.6100 2024-06-01 11:01:08,977 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 3:53:11, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0383, loss_cls: 0.1537, acc: 94.0720, loss_bbox: 0.2099, loss_mask: 0.2160, loss: 0.6321 2024-06-01 11:01:41,187 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 3:52:38, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0352, loss_cls: 0.1445, acc: 94.4795, loss_bbox: 0.1963, loss_mask: 0.2125, loss: 0.6017 2024-06-01 11:02:15,570 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 3:52:05, time: 0.688, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0360, loss_cls: 0.1499, acc: 94.2671, loss_bbox: 0.2019, loss_mask: 0.2127, loss: 0.6135 2024-06-01 11:02:51,699 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 3:51:33, time: 0.723, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0350, loss_cls: 0.1464, acc: 94.3105, loss_bbox: 0.2010, loss_mask: 0.2134, loss: 0.6094 2024-06-01 11:03:23,810 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 3:51:00, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0366, loss_cls: 0.1500, acc: 94.2153, loss_bbox: 0.2068, loss_mask: 0.2133, loss: 0.6204 2024-06-01 11:03:59,000 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 3:50:28, time: 0.703, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0366, loss_cls: 0.1492, acc: 94.4053, loss_bbox: 0.2015, loss_mask: 0.2135, loss: 0.6132 2024-06-01 11:04:33,397 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 3:49:56, time: 0.688, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0351, loss_cls: 0.1537, acc: 94.0996, loss_bbox: 0.2085, loss_mask: 0.2163, loss: 0.6266 2024-06-01 11:05:05,894 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 3:49:23, time: 0.650, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0374, loss_cls: 0.1509, acc: 94.2295, loss_bbox: 0.2026, loss_mask: 0.2187, loss: 0.6228 2024-06-01 11:05:42,572 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 3:48:51, time: 0.734, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0367, loss_cls: 0.1496, acc: 94.2542, loss_bbox: 0.2019, loss_mask: 0.2137, loss: 0.6159 2024-06-01 11:06:14,856 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 3:48:18, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0349, loss_cls: 0.1488, acc: 94.2134, loss_bbox: 0.2049, loss_mask: 0.2172, loss: 0.6181 2024-06-01 11:06:47,164 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 3:47:45, time: 0.646, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0372, loss_cls: 0.1506, acc: 94.3494, loss_bbox: 0.2015, loss_mask: 0.2141, loss: 0.6174 2024-06-01 11:07:19,473 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 3:47:12, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0365, loss_cls: 0.1534, acc: 94.1482, loss_bbox: 0.2119, loss_mask: 0.2179, loss: 0.6330 2024-06-01 11:07:54,319 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 3:46:40, time: 0.697, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0359, loss_cls: 0.1497, acc: 94.2649, loss_bbox: 0.2056, loss_mask: 0.2151, loss: 0.6197 2024-06-01 11:08:26,471 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 3:46:07, time: 0.643, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0357, loss_cls: 0.1522, acc: 94.1187, loss_bbox: 0.2075, loss_mask: 0.2174, loss: 0.6272 2024-06-01 11:08:58,717 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 3:45:34, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0364, loss_cls: 0.1471, acc: 94.4094, loss_bbox: 0.2032, loss_mask: 0.2154, loss: 0.6158 2024-06-01 11:09:33,190 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 3:45:01, time: 0.689, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0357, loss_cls: 0.1552, acc: 94.1150, loss_bbox: 0.2092, loss_mask: 0.2159, loss: 0.6291 2024-06-01 11:10:05,277 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 3:44:28, time: 0.642, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0375, loss_cls: 0.1558, acc: 94.0598, loss_bbox: 0.2093, loss_mask: 0.2236, loss: 0.6408 2024-06-01 11:10:37,493 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 3:43:55, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0367, loss_cls: 0.1479, acc: 94.3843, loss_bbox: 0.2035, loss_mask: 0.2128, loss: 0.6143 2024-06-01 11:11:09,464 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 3:43:22, time: 0.639, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0358, loss_cls: 0.1479, acc: 94.3503, loss_bbox: 0.2010, loss_mask: 0.2168, loss: 0.6146 2024-06-01 11:11:41,605 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 3:42:49, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0345, loss_cls: 0.1415, acc: 94.6025, loss_bbox: 0.1944, loss_mask: 0.2090, loss: 0.5923 2024-06-01 11:12:13,803 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 3:42:16, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0363, loss_cls: 0.1510, acc: 94.2429, loss_bbox: 0.2038, loss_mask: 0.2214, loss: 0.6268 2024-06-01 11:12:45,770 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 3:41:43, time: 0.639, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0123, 
loss_rpn_bbox: 0.0345, loss_cls: 0.1438, acc: 94.4746, loss_bbox: 0.1976, loss_mask: 0.2139, loss: 0.6020 2024-06-01 11:13:18,332 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 3:41:10, time: 0.651, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0385, loss_cls: 0.1551, acc: 94.1128, loss_bbox: 0.2091, loss_mask: 0.2169, loss: 0.6332 2024-06-01 11:13:52,595 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 3:40:37, time: 0.685, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0365, loss_cls: 0.1438, acc: 94.4697, loss_bbox: 0.2011, loss_mask: 0.2176, loss: 0.6122 2024-06-01 11:14:27,397 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 3:40:05, time: 0.696, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0354, loss_cls: 0.1478, acc: 94.3789, loss_bbox: 0.2010, loss_mask: 0.2159, loss: 0.6133 2024-06-01 11:14:59,269 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 3:39:32, time: 0.637, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0338, loss_cls: 0.1411, acc: 94.6150, loss_bbox: 0.1935, loss_mask: 0.2107, loss: 0.5923 2024-06-01 11:15:35,410 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 3:39:00, time: 0.723, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0358, loss_cls: 0.1455, acc: 94.4888, loss_bbox: 0.1977, loss_mask: 0.2158, loss: 0.6088 2024-06-01 11:16:09,837 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 3:38:27, time: 0.689, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0385, loss_cls: 0.1539, acc: 94.0964, loss_bbox: 0.2074, loss_mask: 0.2180, loss: 0.6325 2024-06-01 11:16:41,783 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 3:37:54, time: 0.639, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0346, loss_cls: 0.1492, acc: 94.3245, loss_bbox: 0.2052, loss_mask: 0.2130, loss: 0.6160 2024-06-01 11:17:18,360 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 3:37:23, time: 0.732, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0354, loss_cls: 0.1523, acc: 94.1819, loss_bbox: 0.2022, loss_mask: 0.2158, loss: 0.6188 2024-06-01 11:17:50,005 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 3:36:49, time: 0.633, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0341, loss_cls: 0.1385, acc: 94.6467, loss_bbox: 0.1911, loss_mask: 0.2129, loss: 0.5893 2024-06-01 11:18:22,121 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 3:36:16, time: 0.642, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0336, loss_cls: 0.1522, acc: 94.1790, loss_bbox: 0.2075, loss_mask: 0.2127, loss: 0.6188 2024-06-01 11:18:54,283 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 3:35:43, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0349, loss_cls: 0.1460, acc: 94.3923, loss_bbox: 0.1967, loss_mask: 0.2119, loss: 0.6017 2024-06-01 11:19:29,191 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 3:35:11, time: 0.698, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0361, loss_cls: 0.1485, acc: 94.3450, loss_bbox: 0.2044, loss_mask: 0.2155, loss: 0.6178 2024-06-01 11:20:01,356 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 3:34:38, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0340, loss_cls: 0.1386, acc: 94.7244, loss_bbox: 0.1916, loss_mask: 0.2114, 
loss: 0.5873 2024-06-01 11:20:33,485 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 3:34:05, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0356, loss_cls: 0.1504, acc: 94.2893, loss_bbox: 0.2092, loss_mask: 0.2189, loss: 0.6277 2024-06-01 11:21:07,777 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 3:33:32, time: 0.686, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0361, loss_cls: 0.1510, acc: 94.1746, loss_bbox: 0.2092, loss_mask: 0.2175, loss: 0.6277 2024-06-01 11:21:40,167 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 3:32:59, time: 0.648, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0392, loss_cls: 0.1525, acc: 94.2144, loss_bbox: 0.2113, loss_mask: 0.2197, loss: 0.6367 2024-06-01 11:22:12,416 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 3:32:26, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0374, loss_cls: 0.1421, acc: 94.5413, loss_bbox: 0.2003, loss_mask: 0.2106, loss: 0.6042 2024-06-01 11:22:44,415 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 3:31:53, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0363, loss_cls: 0.1478, acc: 94.3271, loss_bbox: 0.2030, loss_mask: 0.2148, loss: 0.6156 2024-06-01 11:23:16,601 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 3:31:20, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0373, loss_cls: 0.1517, acc: 94.1272, loss_bbox: 0.2102, loss_mask: 0.2199, loss: 0.6331 2024-06-01 11:23:48,629 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 3:30:47, time: 0.641, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0349, loss_cls: 0.1422, acc: 94.5298, loss_bbox: 0.1981, loss_mask: 0.2190, loss: 0.6073 2024-06-01 11:24:20,475 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 3:30:14, time: 0.637, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0322, loss_cls: 0.1437, acc: 94.4551, loss_bbox: 0.1947, loss_mask: 0.2087, loss: 0.5917 2024-06-01 11:24:52,634 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 3:29:41, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0358, loss_cls: 0.1476, acc: 94.3774, loss_bbox: 0.2066, loss_mask: 0.2207, loss: 0.6246 2024-06-01 11:25:27,249 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 3:29:08, time: 0.692, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0381, loss_cls: 0.1529, acc: 94.1499, loss_bbox: 0.2114, loss_mask: 0.2195, loss: 0.6357 2024-06-01 11:26:02,121 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 3:28:36, time: 0.697, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0356, loss_cls: 0.1525, acc: 94.1077, loss_bbox: 0.2095, loss_mask: 0.2186, loss: 0.6305 2024-06-01 11:26:34,257 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 3:28:03, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0381, loss_cls: 0.1537, acc: 94.0002, loss_bbox: 0.2098, loss_mask: 0.2207, loss: 0.6365 2024-06-01 11:27:10,633 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 3:27:31, time: 0.728, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0373, loss_cls: 0.1534, acc: 94.1868, loss_bbox: 0.2039, loss_mask: 0.2147, loss: 0.6233 2024-06-01 11:27:42,674 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
3:26:58, time: 0.641, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0372, loss_cls: 0.1504, acc: 94.2014, loss_bbox: 0.2029, loss_mask: 0.2150, loss: 0.6195 2024-06-01 11:28:14,756 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 3:26:25, time: 0.642, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0394, loss_cls: 0.1546, acc: 94.0603, loss_bbox: 0.2115, loss_mask: 0.2255, loss: 0.6440 2024-06-01 11:28:51,702 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 3:25:53, time: 0.739, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0385, loss_cls: 0.1540, acc: 94.0698, loss_bbox: 0.2107, loss_mask: 0.2159, loss: 0.6329 2024-06-01 11:29:24,025 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 3:25:20, time: 0.646, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0368, loss_cls: 0.1547, acc: 94.1006, loss_bbox: 0.2107, loss_mask: 0.2169, loss: 0.6330 2024-06-01 11:29:56,665 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 3:24:47, time: 0.652, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0367, loss_cls: 0.1566, acc: 94.0640, loss_bbox: 0.2115, loss_mask: 0.2197, loss: 0.6389 2024-06-01 11:30:28,957 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 3:24:14, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0362, loss_cls: 0.1521, acc: 94.2041, loss_bbox: 0.2093, loss_mask: 0.2182, loss: 0.6288 2024-06-01 11:31:03,700 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 3:23:42, time: 0.695, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0339, loss_cls: 0.1447, acc: 94.4897, loss_bbox: 0.2001, loss_mask: 0.2175, loss: 0.6085 2024-06-01 11:31:36,245 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 3:23:09, time: 0.651, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0379, loss_cls: 0.1527, acc: 94.0840, loss_bbox: 0.2120, loss_mask: 0.2209, loss: 0.6379 2024-06-01 11:32:08,402 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 3:22:36, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0380, loss_cls: 0.1518, acc: 94.2102, loss_bbox: 0.2083, loss_mask: 0.2154, loss: 0.6278 2024-06-01 11:32:42,578 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 3:22:03, time: 0.683, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0354, loss_cls: 0.1528, acc: 94.1597, loss_bbox: 0.2045, loss_mask: 0.2146, loss: 0.6203 2024-06-01 11:33:14,917 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 3:21:30, time: 0.647, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0389, loss_cls: 0.1542, acc: 94.0481, loss_bbox: 0.2102, loss_mask: 0.2169, loss: 0.6341 2024-06-01 11:33:47,212 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 3:20:57, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0360, loss_cls: 0.1460, acc: 94.3918, loss_bbox: 0.1999, loss_mask: 0.2101, loss: 0.6049 2024-06-01 11:34:19,642 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 3:20:24, time: 0.649, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0385, loss_cls: 0.1524, acc: 94.1672, loss_bbox: 0.2063, loss_mask: 0.2140, loss: 0.6248 2024-06-01 11:34:52,184 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 3:19:51, time: 0.651, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 
0.0376, loss_cls: 0.1511, acc: 94.1169, loss_bbox: 0.2083, loss_mask: 0.2136, loss: 0.6239 2024-06-01 11:35:24,373 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 3:19:18, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0369, loss_cls: 0.1538, acc: 94.1274, loss_bbox: 0.2059, loss_mask: 0.2121, loss: 0.6220 2024-06-01 11:35:56,719 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 3:18:45, time: 0.647, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0358, loss_cls: 0.1515, acc: 94.1953, loss_bbox: 0.2058, loss_mask: 0.2141, loss: 0.6214 2024-06-01 11:36:28,703 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 3:18:12, time: 0.640, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0357, loss_cls: 0.1458, acc: 94.3618, loss_bbox: 0.1991, loss_mask: 0.2116, loss: 0.6046 2024-06-01 11:37:02,979 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 3:17:40, time: 0.686, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0361, loss_cls: 0.1441, acc: 94.4788, loss_bbox: 0.1975, loss_mask: 0.2148, loss: 0.6047 2024-06-01 11:37:37,203 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 3:17:07, time: 0.684, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0353, loss_cls: 0.1460, acc: 94.4565, loss_bbox: 0.2008, loss_mask: 0.2138, loss: 0.6083 2024-06-01 11:38:09,955 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 3:16:34, time: 0.655, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0345, loss_cls: 0.1447, acc: 94.4160, loss_bbox: 0.1991, loss_mask: 0.2124, loss: 0.6032 2024-06-01 11:38:46,910 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 3:16:02, time: 0.739, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0371, loss_cls: 0.1504, acc: 94.2021, loss_bbox: 0.2047, loss_mask: 0.2164, loss: 0.6228 2024-06-01 11:39:18,917 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 3:15:29, time: 0.640, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0346, loss_cls: 0.1448, acc: 94.4175, loss_bbox: 0.2013, loss_mask: 0.2102, loss: 0.6029 2024-06-01 11:39:50,991 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 3:14:56, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0360, loss_cls: 0.1502, acc: 94.3315, loss_bbox: 0.2040, loss_mask: 0.2143, loss: 0.6171 2024-06-01 11:40:27,340 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 3:14:24, time: 0.727, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0359, loss_cls: 0.1497, acc: 94.3101, loss_bbox: 0.2025, loss_mask: 0.2117, loss: 0.6129 2024-06-01 11:40:59,526 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 3:13:51, time: 0.644, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0362, loss_cls: 0.1554, acc: 94.0554, loss_bbox: 0.2115, loss_mask: 0.2241, loss: 0.6416 2024-06-01 11:41:32,026 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 3:13:18, time: 0.650, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0355, loss_cls: 0.1423, acc: 94.5723, loss_bbox: 0.1929, loss_mask: 0.2112, loss: 0.5943 2024-06-01 11:42:04,002 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 3:12:45, time: 0.640, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0384, loss_cls: 0.1551, acc: 94.0530, loss_bbox: 0.2070, loss_mask: 0.2192, loss: 0.6330 
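In the per-iteration records above, the reported total loss is the sum of the five individual loss terms (loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask); acc is a classification accuracy, not a loss. For example, in the Epoch [10][4400/7330] record just above, 0.0133 + 0.0384 + 0.1551 + 0.2070 + 0.2192 = 0.6330, which matches the logged total up to the 4-decimal rounding used here. Below is a minimal Python sketch (standard library only; the regex and variable names are illustrative and not part of the training code) that parses one such record and checks the sum:

import re

# One training record copied verbatim from the log above (Epoch [10][4400/7330]).
record = ("Epoch [10][4400/7330] lr: 1.000e-05, eta: 3:12:45, time: 0.640, "
          "data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0133, "
          "loss_rpn_bbox: 0.0384, loss_cls: 0.1551, acc: 94.0530, "
          "loss_bbox: 0.2070, loss_mask: 0.2192, loss: 0.6330")

# Extract "key: value" pairs (numeric values only).
fields = dict(re.findall(r"(\w+): ([\d.]+)", record))

# Sum the component losses and compare with the logged total.
components = ["loss_rpn_cls", "loss_rpn_bbox", "loss_cls", "loss_bbox", "loss_mask"]
component_sum = sum(float(fields[k]) for k in components)
print(f"sum of components: {component_sum:.4f}, logged total: {fields['loss']}")
# prints: sum of components: 0.6330, logged total: 0.6330

The same pattern, applied record by record to the full log, is enough to plot the loss and accuracy curves across epochs 9-11.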
2024-06-01 11:42:38,695 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 3:12:13, time: 0.694, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0355, loss_cls: 0.1456, acc: 94.4656, loss_bbox: 0.1993, loss_mask: 0.2121, loss: 0.6056 2024-06-01 11:43:10,504 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 3:11:39, time: 0.636, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0365, loss_cls: 0.1502, acc: 94.2632, loss_bbox: 0.2065, loss_mask: 0.2187, loss: 0.6266 2024-06-01 11:43:42,517 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 3:11:06, time: 0.640, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0370, loss_cls: 0.1505, acc: 94.1711, loss_bbox: 0.2089, loss_mask: 0.2147, loss: 0.6255 2024-06-01 11:44:17,534 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 3:10:34, time: 0.700, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0355, loss_cls: 0.1529, acc: 94.0940, loss_bbox: 0.2053, loss_mask: 0.2123, loss: 0.6195 2024-06-01 11:44:49,576 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 3:10:01, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0355, loss_cls: 0.1432, acc: 94.4646, loss_bbox: 0.2000, loss_mask: 0.2165, loss: 0.6074 2024-06-01 11:45:21,603 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 3:09:28, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0341, loss_cls: 0.1440, acc: 94.4521, loss_bbox: 0.1964, loss_mask: 0.2080, loss: 0.5956 2024-06-01 11:45:53,786 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 3:08:55, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0373, loss_cls: 0.1521, acc: 94.1619, loss_bbox: 0.2135, loss_mask: 0.2196, loss: 0.6360 2024-06-01 11:46:25,935 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 3:08:22, time: 0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0378, loss_cls: 0.1560, acc: 94.0212, loss_bbox: 0.2111, loss_mask: 0.2218, loss: 0.6403 2024-06-01 11:46:57,997 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 3:07:49, time: 0.641, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0362, loss_cls: 0.1518, acc: 94.1685, loss_bbox: 0.2087, loss_mask: 0.2205, loss: 0.6314 2024-06-01 11:47:29,995 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 3:07:16, time: 0.640, data_time: 0.023, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0362, loss_cls: 0.1443, acc: 94.4551, loss_bbox: 0.1990, loss_mask: 0.2094, loss: 0.6013 2024-06-01 11:48:02,167 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 3:06:43, time: 0.643, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0395, loss_cls: 0.1562, acc: 94.0901, loss_bbox: 0.2083, loss_mask: 0.2208, loss: 0.6401 2024-06-01 11:48:36,606 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 3:06:10, time: 0.689, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0365, loss_cls: 0.1452, acc: 94.4314, loss_bbox: 0.2005, loss_mask: 0.2104, loss: 0.6074 2024-06-01 11:49:12,478 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 3:05:38, time: 0.717, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0345, loss_cls: 0.1405, acc: 94.6963, loss_bbox: 0.1989, loss_mask: 0.2116, loss: 0.5979 2024-06-01 11:49:44,501 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 3:05:05, 
time: 0.640, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0343, loss_cls: 0.1465, acc: 94.3118, loss_bbox: 0.2061, loss_mask: 0.2210, loss: 0.6207 2024-06-01 11:50:21,372 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 3:04:33, time: 0.737, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0372, loss_cls: 0.1522, acc: 94.2212, loss_bbox: 0.2067, loss_mask: 0.2178, loss: 0.6283 2024-06-01 11:50:53,441 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 3:04:00, time: 0.641, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0368, loss_cls: 0.1536, acc: 94.1433, loss_bbox: 0.2079, loss_mask: 0.2201, loss: 0.6321 2024-06-01 11:51:27,867 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 3:03:27, time: 0.688, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0376, loss_cls: 0.1527, acc: 94.1816, loss_bbox: 0.2101, loss_mask: 0.2158, loss: 0.6303 2024-06-01 11:52:02,558 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 3:02:55, time: 0.694, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0372, loss_cls: 0.1573, acc: 93.9446, loss_bbox: 0.2165, loss_mask: 0.2230, loss: 0.6488 2024-06-01 11:52:34,732 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 3:02:22, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0339, loss_cls: 0.1453, acc: 94.4971, loss_bbox: 0.1988, loss_mask: 0.2137, loss: 0.6043 2024-06-01 11:53:06,876 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 3:01:49, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0378, loss_cls: 0.1533, acc: 94.2419, loss_bbox: 0.2066, loss_mask: 0.2224, loss: 0.6338 2024-06-01 11:53:41,564 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 3:01:16, time: 0.694, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0363, loss_cls: 0.1501, acc: 94.2063, loss_bbox: 0.2088, loss_mask: 0.2196, loss: 0.6280 2024-06-01 11:54:13,900 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 3:00:43, time: 0.647, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0348, loss_cls: 0.1462, acc: 94.3491, loss_bbox: 0.2017, loss_mask: 0.2114, loss: 0.6067 2024-06-01 11:54:46,406 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 3:00:10, time: 0.650, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0359, loss_cls: 0.1536, acc: 94.1003, loss_bbox: 0.2074, loss_mask: 0.2175, loss: 0.6292 2024-06-01 11:55:18,402 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 2:59:37, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0351, loss_cls: 0.1428, acc: 94.4995, loss_bbox: 0.1979, loss_mask: 0.2108, loss: 0.5995 2024-06-01 11:55:53,004 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 2:59:05, time: 0.692, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0366, loss_cls: 0.1460, acc: 94.3975, loss_bbox: 0.2079, loss_mask: 0.2152, loss: 0.6187 2024-06-01 11:56:25,313 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 2:58:32, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0384, loss_cls: 0.1538, acc: 94.1013, loss_bbox: 0.2115, loss_mask: 0.2203, loss: 0.6384 2024-06-01 11:56:57,029 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 2:57:59, time: 0.634, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0358, 
loss_cls: 0.1426, acc: 94.5222, loss_bbox: 0.1948, loss_mask: 0.2094, loss: 0.5952 2024-06-01 11:57:29,247 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 2:57:26, time: 0.644, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0372, loss_cls: 0.1488, acc: 94.2859, loss_bbox: 0.2017, loss_mask: 0.2149, loss: 0.6172 2024-06-01 11:58:01,368 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 2:56:52, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0367, loss_cls: 0.1443, acc: 94.4468, loss_bbox: 0.1988, loss_mask: 0.2142, loss: 0.6077 2024-06-01 11:58:33,376 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 2:56:19, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0355, loss_cls: 0.1445, acc: 94.5007, loss_bbox: 0.1964, loss_mask: 0.2136, loss: 0.6036 2024-06-01 11:59:05,351 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 2:55:46, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0336, loss_cls: 0.1424, acc: 94.5098, loss_bbox: 0.1970, loss_mask: 0.2086, loss: 0.5945 2024-06-01 11:59:37,360 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 2:55:13, time: 0.640, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0354, loss_cls: 0.1443, acc: 94.4182, loss_bbox: 0.1991, loss_mask: 0.2110, loss: 0.6026 2024-06-01 12:00:11,399 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 2:54:41, time: 0.681, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0346, loss_cls: 0.1439, acc: 94.5078, loss_bbox: 0.1965, loss_mask: 0.2115, loss: 0.5983 2024-06-01 12:00:47,037 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 2:54:08, time: 0.713, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0355, loss_cls: 0.1487, acc: 94.2559, loss_bbox: 0.2023, loss_mask: 0.2130, loss: 0.6131 2024-06-01 12:01:19,550 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 2:53:35, time: 0.650, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0379, loss_cls: 0.1542, acc: 94.0713, loss_bbox: 0.2084, loss_mask: 0.2174, loss: 0.6328 2024-06-01 12:01:56,556 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 2:53:03, time: 0.741, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0359, loss_cls: 0.1474, acc: 94.3513, loss_bbox: 0.2011, loss_mask: 0.2179, loss: 0.6163 2024-06-01 12:02:28,820 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 2:52:30, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0350, loss_cls: 0.1512, acc: 94.1455, loss_bbox: 0.2053, loss_mask: 0.2140, loss: 0.6185 2024-06-01 12:03:05,601 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 2:51:58, time: 0.736, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0373, loss_cls: 0.1538, acc: 94.0876, loss_bbox: 0.2119, loss_mask: 0.2211, loss: 0.6383 2024-06-01 12:03:37,823 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 2:51:25, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0346, loss_cls: 0.1434, acc: 94.5195, loss_bbox: 0.1905, loss_mask: 0.2101, loss: 0.5923 2024-06-01 12:04:10,304 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 2:50:52, time: 0.650, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0353, loss_cls: 0.1506, acc: 94.2776, loss_bbox: 0.2051, loss_mask: 0.2170, loss: 0.6220 2024-06-01 
12:04:42,687 - mmdet - INFO - Epoch [10][6450/7330] lr: 1.000e-05, eta: 2:50:19, time: 0.648, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0356, loss_cls: 0.1476, acc: 94.3120, loss_bbox: 0.2061, loss_mask: 0.2205, loss: 0.6232 2024-06-01 12:05:16,633 - mmdet - INFO - Epoch [10][6500/7330] lr: 1.000e-05, eta: 2:49:47, time: 0.679, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0364, loss_cls: 0.1456, acc: 94.4155, loss_bbox: 0.2012, loss_mask: 0.2146, loss: 0.6096 2024-06-01 12:05:48,802 - mmdet - INFO - Epoch [10][6550/7330] lr: 1.000e-05, eta: 2:49:14, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0357, loss_cls: 0.1539, acc: 94.1353, loss_bbox: 0.2067, loss_mask: 0.2172, loss: 0.6282 2024-06-01 12:06:21,263 - mmdet - INFO - Epoch [10][6600/7330] lr: 1.000e-05, eta: 2:48:41, time: 0.649, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0355, loss_cls: 0.1475, acc: 94.3125, loss_bbox: 0.2005, loss_mask: 0.2176, loss: 0.6140 2024-06-01 12:06:53,400 - mmdet - INFO - Epoch [10][6650/7330] lr: 1.000e-05, eta: 2:48:08, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0384, loss_cls: 0.1541, acc: 94.1377, loss_bbox: 0.2066, loss_mask: 0.2177, loss: 0.6318 2024-06-01 12:07:27,719 - mmdet - INFO - Epoch [10][6700/7330] lr: 1.000e-05, eta: 2:47:35, time: 0.686, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0379, loss_cls: 0.1532, acc: 94.1060, loss_bbox: 0.2110, loss_mask: 0.2230, loss: 0.6404 2024-06-01 12:07:59,977 - mmdet - INFO - Epoch [10][6750/7330] lr: 1.000e-05, eta: 2:47:02, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0374, loss_cls: 0.1474, acc: 94.3262, loss_bbox: 0.2081, loss_mask: 0.2174, loss: 0.6237 2024-06-01 12:08:32,079 - mmdet - INFO - Epoch [10][6800/7330] lr: 1.000e-05, eta: 2:46:29, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0360, loss_cls: 0.1490, acc: 94.3059, loss_bbox: 0.2030, loss_mask: 0.2140, loss: 0.6163 2024-06-01 12:09:04,652 - mmdet - INFO - Epoch [10][6850/7330] lr: 1.000e-05, eta: 2:45:56, time: 0.651, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0381, loss_cls: 0.1553, acc: 94.0986, loss_bbox: 0.2167, loss_mask: 0.2182, loss: 0.6428 2024-06-01 12:09:37,143 - mmdet - INFO - Epoch [10][6900/7330] lr: 1.000e-05, eta: 2:45:23, time: 0.650, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0357, loss_cls: 0.1496, acc: 94.2937, loss_bbox: 0.2115, loss_mask: 0.2213, loss: 0.6324 2024-06-01 12:10:09,523 - mmdet - INFO - Epoch [10][6950/7330] lr: 1.000e-05, eta: 2:44:50, time: 0.648, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0361, loss_cls: 0.1457, acc: 94.3982, loss_bbox: 0.2028, loss_mask: 0.2135, loss: 0.6119 2024-06-01 12:10:41,837 - mmdet - INFO - Epoch [10][7000/7330] lr: 1.000e-05, eta: 2:44:17, time: 0.646, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0357, loss_cls: 0.1513, acc: 94.2598, loss_bbox: 0.2069, loss_mask: 0.2169, loss: 0.6246 2024-06-01 12:11:13,942 - mmdet - INFO - Epoch [10][7050/7330] lr: 1.000e-05, eta: 2:43:44, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0357, loss_cls: 0.1468, acc: 94.4436, loss_bbox: 0.2008, loss_mask: 0.2176, loss: 0.6145 2024-06-01 12:11:48,473 - mmdet - INFO - Epoch [10][7100/7330] lr: 1.000e-05, eta: 2:43:12, time: 0.691, 
data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0386, loss_cls: 0.1498, acc: 94.1968, loss_bbox: 0.2070, loss_mask: 0.2150, loss: 0.6240
2024-06-01 12:12:23,684 - mmdet - INFO - Epoch [10][7150/7330] lr: 1.000e-05, eta: 2:42:39, time: 0.704, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0358, loss_cls: 0.1477, acc: 94.3572, loss_bbox: 0.1998, loss_mask: 0.2107, loss: 0.6071
2024-06-01 12:12:58,325 - mmdet - INFO - Epoch [10][7200/7330] lr: 1.000e-05, eta: 2:42:06, time: 0.644, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0390, loss_cls: 0.1579, acc: 93.9329, loss_bbox: 0.2153, loss_mask: 0.2236, loss: 0.6508
2024-06-01 12:13:32,860 - mmdet - INFO - Epoch [10][7250/7330] lr: 1.000e-05, eta: 2:41:34, time: 0.739, data_time: 0.078, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0352, loss_cls: 0.1441, acc: 94.4468, loss_bbox: 0.1949, loss_mask: 0.2142, loss: 0.6018
2024-06-01 12:14:04,979 - mmdet - INFO - Epoch [10][7300/7330] lr: 1.000e-05, eta: 2:41:01, time: 0.642, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0367, loss_cls: 0.1485, acc: 94.3130, loss_bbox: 0.2061, loss_mask: 0.2169, loss: 0.6220
2024-06-01 12:14:29,030 - mmdet - INFO - Saving checkpoint at 10 epochs
2024-06-01 12:16:05,217 - mmdet - INFO - Evaluating bbox...
2024-06-01 12:16:29,759 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.482
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.712
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.523
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.291
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.525
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.657
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.393
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.638
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.763
2024-06-01 12:16:29,759 - mmdet - INFO - Evaluating segm...
2024-06-01 12:16:57,143 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.673
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.450
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.204
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.460
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.646
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.314
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.575
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.720
2024-06-01 12:16:57,680 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 12:16:57,681 - mmdet - INFO - Epoch(val) [10][625] bbox_mAP: 0.4820, bbox_mAP_50: 0.7120, bbox_mAP_75: 0.5230, bbox_mAP_s: 0.2910, bbox_mAP_m: 0.5250, bbox_mAP_l: 0.6570, bbox_mAP_copypaste: 0.482 0.712 0.523 0.291 0.525 0.657, segm_mAP: 0.4240, segm_mAP_50: 0.6730, segm_mAP_75: 0.4500, segm_mAP_s: 0.2040, segm_mAP_m: 0.4600, segm_mAP_l: 0.6460, segm_mAP_copypaste: 0.424 0.673 0.450 0.204 0.460 0.646
2024-06-01 12:17:36,900 - mmdet - INFO - Epoch [11][50/7330] lr: 1.000e-05, eta: 2:40:06, time: 0.784, data_time: 0.110, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0348, loss_cls: 0.1466, acc: 94.3201, loss_bbox: 0.2020, loss_mask: 0.2139, loss: 0.6096
2024-06-01 12:18:11,213 - mmdet - INFO - Epoch [11][100/7330] lr: 1.000e-05, eta: 2:39:33, time: 0.686, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0348, loss_cls: 0.1414, acc: 94.5879, loss_bbox: 0.1944, loss_mask: 0.2115, loss: 0.5959
2024-06-01 12:18:43,525 - mmdet - INFO - Epoch [11][150/7330] lr: 1.000e-05, eta: 2:39:00, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0373, loss_cls: 0.1489, acc: 94.2993, loss_bbox: 0.2060, loss_mask: 0.2214, loss: 0.6280
2024-06-01 12:19:15,706 - mmdet - INFO - Epoch [11][200/7330] lr: 1.000e-05, eta: 2:38:27, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0371, loss_cls: 0.1493, acc: 94.2678, loss_bbox: 0.2013, loss_mask: 0.2132, loss: 0.6140
2024-06-01 12:19:48,353 - mmdet - INFO - Epoch [11][250/7330] lr: 1.000e-05, eta: 2:37:54, time: 0.653, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0389, loss_cls: 0.1604, acc: 93.8975, loss_bbox: 0.2204, loss_mask: 0.2247, loss: 0.6605
2024-06-01 12:20:20,491 - mmdet - INFO - Epoch [11][300/7330] lr: 1.000e-05, eta: 2:37:21, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0339, loss_cls: 0.1442, acc: 94.5566, loss_bbox: 0.1924, loss_mask: 0.2134, loss: 0.5964
2024-06-01 12:20:52,629 - mmdet - INFO - Epoch [11][350/7330] lr: 1.000e-05, eta: 2:36:48, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0357, loss_cls: 0.1437, acc: 94.4990, loss_bbox: 0.2004, loss_mask: 0.2106, loss: 0.6040
2024-06-01 12:21:25,344 - mmdet - INFO - Epoch [11][400/7330] lr: 1.000e-05, eta: 2:36:15, time: 0.654, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0380, loss_cls: 0.1542, acc: 94.1138, loss_bbox: 0.2116,
loss_mask: 0.2194, loss: 0.6377 2024-06-01 12:21:57,786 - mmdet - INFO - Epoch [11][450/7330] lr: 1.000e-05, eta: 2:35:42, time: 0.649, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0373, loss_cls: 0.1461, acc: 94.2507, loss_bbox: 0.2044, loss_mask: 0.2178, loss: 0.6195 2024-06-01 12:22:32,362 - mmdet - INFO - Epoch [11][500/7330] lr: 1.000e-05, eta: 2:35:10, time: 0.692, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0358, loss_cls: 0.1469, acc: 94.3525, loss_bbox: 0.2029, loss_mask: 0.2149, loss: 0.6140 2024-06-01 12:23:04,670 - mmdet - INFO - Epoch [11][550/7330] lr: 1.000e-05, eta: 2:34:37, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0368, loss_cls: 0.1516, acc: 94.1155, loss_bbox: 0.2083, loss_mask: 0.2196, loss: 0.6302 2024-06-01 12:23:36,775 - mmdet - INFO - Epoch [11][600/7330] lr: 1.000e-05, eta: 2:34:04, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0354, loss_cls: 0.1476, acc: 94.3677, loss_bbox: 0.2037, loss_mask: 0.2178, loss: 0.6173 2024-06-01 12:24:08,767 - mmdet - INFO - Epoch [11][650/7330] lr: 1.000e-05, eta: 2:33:31, time: 0.640, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0346, loss_cls: 0.1427, acc: 94.4985, loss_bbox: 0.1970, loss_mask: 0.2104, loss: 0.5967 2024-06-01 12:24:41,219 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 12:24:41,219 - mmdet - INFO - Epoch [11][700/7330] lr: 1.000e-05, eta: 2:32:58, time: 0.649, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0363, loss_cls: 0.1475, acc: 94.3535, loss_bbox: 0.2036, loss_mask: 0.2135, loss: 0.6126 2024-06-01 12:25:13,382 - mmdet - INFO - Epoch [11][750/7330] lr: 1.000e-05, eta: 2:32:25, time: 0.643, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0372, loss_cls: 0.1501, acc: 94.1335, loss_bbox: 0.2047, loss_mask: 0.2129, loss: 0.6188 2024-06-01 12:25:45,536 - mmdet - INFO - Epoch [11][800/7330] lr: 1.000e-05, eta: 2:31:52, time: 0.643, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0326, loss_cls: 0.1383, acc: 94.6660, loss_bbox: 0.1908, loss_mask: 0.2100, loss: 0.5833 2024-06-01 12:26:17,732 - mmdet - INFO - Epoch [11][850/7330] lr: 1.000e-05, eta: 2:31:19, time: 0.644, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0348, loss_cls: 0.1418, acc: 94.5310, loss_bbox: 0.1990, loss_mask: 0.2144, loss: 0.6025 2024-06-01 12:26:52,416 - mmdet - INFO - Epoch [11][900/7330] lr: 1.000e-05, eta: 2:30:46, time: 0.694, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0344, loss_cls: 0.1430, acc: 94.5933, loss_bbox: 0.1911, loss_mask: 0.2115, loss: 0.5918 2024-06-01 12:27:25,410 - mmdet - INFO - Epoch [11][950/7330] lr: 1.000e-05, eta: 2:30:13, time: 0.660, data_time: 0.054, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0381, loss_cls: 0.1547, acc: 94.0605, loss_bbox: 0.2061, loss_mask: 0.2138, loss: 0.6267 2024-06-01 12:27:57,763 - mmdet - INFO - Epoch [11][1000/7330] lr: 1.000e-05, eta: 2:29:40, time: 0.647, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0385, loss_cls: 0.1458, acc: 94.3975, loss_bbox: 0.1989, loss_mask: 0.2122, loss: 0.6093 2024-06-01 12:28:30,221 - mmdet - INFO - Epoch [11][1050/7330] lr: 1.000e-05, eta: 2:29:07, time: 0.649, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0370, loss_cls: 0.1519, acc: 94.1890, loss_bbox: 0.2060, 
loss_mask: 0.2155, loss: 0.6239 2024-06-01 12:29:06,568 - mmdet - INFO - Epoch [11][1100/7330] lr: 1.000e-05, eta: 2:28:35, time: 0.727, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0338, loss_cls: 0.1436, acc: 94.4619, loss_bbox: 0.1975, loss_mask: 0.2124, loss: 0.5990 2024-06-01 12:29:40,783 - mmdet - INFO - Epoch [11][1150/7330] lr: 1.000e-05, eta: 2:28:03, time: 0.684, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0359, loss_cls: 0.1519, acc: 94.2478, loss_bbox: 0.2052, loss_mask: 0.2190, loss: 0.6248 2024-06-01 12:30:12,906 - mmdet - INFO - Epoch [11][1200/7330] lr: 1.000e-05, eta: 2:27:30, time: 0.642, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0369, loss_cls: 0.1556, acc: 94.0005, loss_bbox: 0.2158, loss_mask: 0.2223, loss: 0.6436 2024-06-01 12:30:44,919 - mmdet - INFO - Epoch [11][1250/7330] lr: 1.000e-05, eta: 2:26:57, time: 0.640, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0334, loss_cls: 0.1402, acc: 94.6609, loss_bbox: 0.1957, loss_mask: 0.2069, loss: 0.5902 2024-06-01 12:31:17,009 - mmdet - INFO - Epoch [11][1300/7330] lr: 1.000e-05, eta: 2:26:24, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0369, loss_cls: 0.1490, acc: 94.3396, loss_bbox: 0.2044, loss_mask: 0.2161, loss: 0.6201 2024-06-01 12:31:51,835 - mmdet - INFO - Epoch [11][1350/7330] lr: 1.000e-05, eta: 2:25:51, time: 0.697, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0369, loss_cls: 0.1440, acc: 94.4751, loss_bbox: 0.2003, loss_mask: 0.2103, loss: 0.6050 2024-06-01 12:32:26,910 - mmdet - INFO - Epoch [11][1400/7330] lr: 1.000e-05, eta: 2:25:19, time: 0.701, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0358, loss_cls: 0.1451, acc: 94.3677, loss_bbox: 0.2030, loss_mask: 0.2209, loss: 0.6177 2024-06-01 12:33:01,315 - mmdet - INFO - Epoch [11][1450/7330] lr: 1.000e-05, eta: 2:24:46, time: 0.688, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0342, loss_cls: 0.1402, acc: 94.5518, loss_bbox: 0.1944, loss_mask: 0.2059, loss: 0.5856 2024-06-01 12:33:33,589 - mmdet - INFO - Epoch [11][1500/7330] lr: 1.000e-05, eta: 2:24:13, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0362, loss_cls: 0.1472, acc: 94.3328, loss_bbox: 0.2076, loss_mask: 0.2172, loss: 0.6195 2024-06-01 12:34:09,039 - mmdet - INFO - Epoch [11][1550/7330] lr: 1.000e-05, eta: 2:23:41, time: 0.709, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0372, loss_cls: 0.1484, acc: 94.2585, loss_bbox: 0.2016, loss_mask: 0.2173, loss: 0.6183 2024-06-01 12:34:40,876 - mmdet - INFO - Epoch [11][1600/7330] lr: 1.000e-05, eta: 2:23:07, time: 0.637, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0329, loss_cls: 0.1361, acc: 94.8018, loss_bbox: 0.1885, loss_mask: 0.2069, loss: 0.5766 2024-06-01 12:35:13,082 - mmdet - INFO - Epoch [11][1650/7330] lr: 1.000e-05, eta: 2:22:34, time: 0.644, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0339, loss_cls: 0.1386, acc: 94.6680, loss_bbox: 0.1949, loss_mask: 0.2096, loss: 0.5893 2024-06-01 12:35:45,452 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 12:35:45,453 - mmdet - INFO - Epoch [11][1700/7330] lr: 1.000e-05, eta: 2:22:02, time: 0.647, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0362, loss_cls: 0.1405, acc: 94.5703, loss_bbox: 
0.1987, loss_mask: 0.2098, loss: 0.5966 2024-06-01 12:36:17,534 - mmdet - INFO - Epoch [11][1750/7330] lr: 1.000e-05, eta: 2:21:28, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0365, loss_cls: 0.1470, acc: 94.3101, loss_bbox: 0.2038, loss_mask: 0.2190, loss: 0.6190 2024-06-01 12:36:49,915 - mmdet - INFO - Epoch [11][1800/7330] lr: 1.000e-05, eta: 2:20:56, time: 0.648, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0369, loss_cls: 0.1495, acc: 94.2190, loss_bbox: 0.2075, loss_mask: 0.2156, loss: 0.6224 2024-06-01 12:37:22,122 - mmdet - INFO - Epoch [11][1850/7330] lr: 1.000e-05, eta: 2:20:23, time: 0.644, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0339, loss_cls: 0.1457, acc: 94.3706, loss_bbox: 0.2005, loss_mask: 0.2127, loss: 0.6059 2024-06-01 12:37:53,951 - mmdet - INFO - Epoch [11][1900/7330] lr: 1.000e-05, eta: 2:19:49, time: 0.637, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0331, loss_cls: 0.1384, acc: 94.6218, loss_bbox: 0.1927, loss_mask: 0.2060, loss: 0.5819 2024-06-01 12:38:28,556 - mmdet - INFO - Epoch [11][1950/7330] lr: 1.000e-05, eta: 2:19:17, time: 0.692, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0359, loss_cls: 0.1448, acc: 94.4214, loss_bbox: 0.2003, loss_mask: 0.2140, loss: 0.6081 2024-06-01 12:39:00,871 - mmdet - INFO - Epoch [11][2000/7330] lr: 1.000e-05, eta: 2:18:44, time: 0.646, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0368, loss_cls: 0.1497, acc: 94.3384, loss_bbox: 0.2029, loss_mask: 0.2149, loss: 0.6169 2024-06-01 12:39:33,732 - mmdet - INFO - Epoch [11][2050/7330] lr: 1.000e-05, eta: 2:18:11, time: 0.657, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0371, loss_cls: 0.1577, acc: 93.9382, loss_bbox: 0.2144, loss_mask: 0.2175, loss: 0.6410 2024-06-01 12:40:06,200 - mmdet - INFO - Epoch [11][2100/7330] lr: 1.000e-05, eta: 2:17:38, time: 0.650, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0352, loss_cls: 0.1462, acc: 94.3699, loss_bbox: 0.1996, loss_mask: 0.2116, loss: 0.6057 2024-06-01 12:40:43,138 - mmdet - INFO - Epoch [11][2150/7330] lr: 1.000e-05, eta: 2:17:06, time: 0.738, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0349, loss_cls: 0.1458, acc: 94.4480, loss_bbox: 0.1994, loss_mask: 0.2161, loss: 0.6105 2024-06-01 12:41:17,703 - mmdet - INFO - Epoch [11][2200/7330] lr: 1.000e-05, eta: 2:16:33, time: 0.692, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0355, loss_cls: 0.1415, acc: 94.6174, loss_bbox: 0.1951, loss_mask: 0.2089, loss: 0.5937 2024-06-01 12:41:49,791 - mmdet - INFO - Epoch [11][2250/7330] lr: 1.000e-05, eta: 2:16:00, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0362, loss_cls: 0.1478, acc: 94.3562, loss_bbox: 0.2035, loss_mask: 0.2128, loss: 0.6134 2024-06-01 12:42:21,995 - mmdet - INFO - Epoch [11][2300/7330] lr: 1.000e-05, eta: 2:15:27, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0353, loss_cls: 0.1441, acc: 94.4143, loss_bbox: 0.2029, loss_mask: 0.2188, loss: 0.6149 2024-06-01 12:42:54,570 - mmdet - INFO - Epoch [11][2350/7330] lr: 1.000e-05, eta: 2:14:54, time: 0.651, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0374, loss_cls: 0.1506, acc: 94.1313, loss_bbox: 0.2034, loss_mask: 0.2190, loss: 0.6238 2024-06-01 12:43:29,093 - mmdet - INFO - Epoch 
[11][2400/7330] lr: 1.000e-05, eta: 2:14:22, time: 0.690, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0362, loss_cls: 0.1450, acc: 94.4517, loss_bbox: 0.2020, loss_mask: 0.2172, loss: 0.6129 2024-06-01 12:44:03,857 - mmdet - INFO - Epoch [11][2450/7330] lr: 1.000e-05, eta: 2:13:49, time: 0.695, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0348, loss_cls: 0.1460, acc: 94.4326, loss_bbox: 0.1990, loss_mask: 0.2126, loss: 0.6049 2024-06-01 12:44:38,779 - mmdet - INFO - Epoch [11][2500/7330] lr: 1.000e-05, eta: 2:13:17, time: 0.698, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0374, loss_cls: 0.1503, acc: 94.2385, loss_bbox: 0.2025, loss_mask: 0.2153, loss: 0.6187 2024-06-01 12:45:10,677 - mmdet - INFO - Epoch [11][2550/7330] lr: 1.000e-05, eta: 2:12:44, time: 0.638, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0334, loss_cls: 0.1408, acc: 94.5552, loss_bbox: 0.1953, loss_mask: 0.2096, loss: 0.5901 2024-06-01 12:45:46,235 - mmdet - INFO - Epoch [11][2600/7330] lr: 1.000e-05, eta: 2:12:11, time: 0.711, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0362, loss_cls: 0.1460, acc: 94.4443, loss_bbox: 0.1976, loss_mask: 0.2158, loss: 0.6085 2024-06-01 12:46:18,457 - mmdet - INFO - Epoch [11][2650/7330] lr: 1.000e-05, eta: 2:11:38, time: 0.644, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0388, loss_cls: 0.1499, acc: 94.2246, loss_bbox: 0.2100, loss_mask: 0.2183, loss: 0.6303 2024-06-01 12:46:50,626 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 12:46:50,626 - mmdet - INFO - Epoch [11][2700/7330] lr: 1.000e-05, eta: 2:11:05, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0365, loss_cls: 0.1506, acc: 94.2119, loss_bbox: 0.2056, loss_mask: 0.2163, loss: 0.6218 2024-06-01 12:47:22,831 - mmdet - INFO - Epoch [11][2750/7330] lr: 1.000e-05, eta: 2:10:32, time: 0.644, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0363, loss_cls: 0.1436, acc: 94.4465, loss_bbox: 0.1980, loss_mask: 0.2151, loss: 0.6062 2024-06-01 12:47:55,337 - mmdet - INFO - Epoch [11][2800/7330] lr: 1.000e-05, eta: 2:09:59, time: 0.650, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0366, loss_cls: 0.1507, acc: 94.1575, loss_bbox: 0.2082, loss_mask: 0.2211, loss: 0.6294 2024-06-01 12:48:27,850 - mmdet - INFO - Epoch [11][2850/7330] lr: 1.000e-05, eta: 2:09:26, time: 0.650, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0399, loss_cls: 0.1583, acc: 94.0068, loss_bbox: 0.2099, loss_mask: 0.2175, loss: 0.6418 2024-06-01 12:48:59,912 - mmdet - INFO - Epoch [11][2900/7330] lr: 1.000e-05, eta: 2:08:53, time: 0.641, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0353, loss_cls: 0.1457, acc: 94.3979, loss_bbox: 0.2024, loss_mask: 0.2183, loss: 0.6138 2024-06-01 12:49:32,229 - mmdet - INFO - Epoch [11][2950/7330] lr: 1.000e-05, eta: 2:08:20, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0367, loss_cls: 0.1467, acc: 94.2690, loss_bbox: 0.2031, loss_mask: 0.2139, loss: 0.6138 2024-06-01 12:50:07,358 - mmdet - INFO - Epoch [11][3000/7330] lr: 1.000e-05, eta: 2:07:48, time: 0.703, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0368, loss_cls: 0.1469, acc: 94.4617, loss_bbox: 0.1984, loss_mask: 0.2102, loss: 0.6058 2024-06-01 12:50:39,976 - mmdet - INFO - 
Epoch [11][3050/7330] lr: 1.000e-05, eta: 2:07:15, time: 0.652, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0375, loss_cls: 0.1480, acc: 94.3892, loss_bbox: 0.2067, loss_mask: 0.2186, loss: 0.6248 2024-06-01 12:51:12,650 - mmdet - INFO - Epoch [11][3100/7330] lr: 1.000e-05, eta: 2:06:42, time: 0.654, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0393, loss_cls: 0.1558, acc: 93.9905, loss_bbox: 0.2140, loss_mask: 0.2177, loss: 0.6413 2024-06-01 12:51:47,727 - mmdet - INFO - Epoch [11][3150/7330] lr: 1.000e-05, eta: 2:06:09, time: 0.702, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0396, loss_cls: 0.1557, acc: 94.0264, loss_bbox: 0.2146, loss_mask: 0.2204, loss: 0.6452 2024-06-01 12:52:22,841 - mmdet - INFO - Epoch [11][3200/7330] lr: 1.000e-05, eta: 2:05:37, time: 0.702, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0369, loss_cls: 0.1463, acc: 94.4380, loss_bbox: 0.2028, loss_mask: 0.2160, loss: 0.6158 2024-06-01 12:52:57,504 - mmdet - INFO - Epoch [11][3250/7330] lr: 1.000e-05, eta: 2:05:04, time: 0.693, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0365, loss_cls: 0.1497, acc: 94.2856, loss_bbox: 0.2089, loss_mask: 0.2211, loss: 0.6300 2024-06-01 12:53:29,946 - mmdet - INFO - Epoch [11][3300/7330] lr: 1.000e-05, eta: 2:04:31, time: 0.649, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0367, loss_cls: 0.1522, acc: 94.1387, loss_bbox: 0.2048, loss_mask: 0.2182, loss: 0.6253 2024-06-01 12:54:02,097 - mmdet - INFO - Epoch [11][3350/7330] lr: 1.000e-05, eta: 2:03:58, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0336, loss_cls: 0.1388, acc: 94.7349, loss_bbox: 0.1901, loss_mask: 0.2079, loss: 0.5826 2024-06-01 12:54:34,230 - mmdet - INFO - Epoch [11][3400/7330] lr: 1.000e-05, eta: 2:03:25, time: 0.643, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0353, loss_cls: 0.1466, acc: 94.3884, loss_bbox: 0.2013, loss_mask: 0.2125, loss: 0.6096 2024-06-01 12:55:09,629 - mmdet - INFO - Epoch [11][3450/7330] lr: 1.000e-05, eta: 2:02:53, time: 0.708, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0349, loss_cls: 0.1419, acc: 94.4875, loss_bbox: 0.1991, loss_mask: 0.2058, loss: 0.5940 2024-06-01 12:55:43,894 - mmdet - INFO - Epoch [11][3500/7330] lr: 1.000e-05, eta: 2:02:20, time: 0.685, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0353, loss_cls: 0.1455, acc: 94.4319, loss_bbox: 0.2034, loss_mask: 0.2143, loss: 0.6114 2024-06-01 12:56:18,479 - mmdet - INFO - Epoch [11][3550/7330] lr: 1.000e-05, eta: 2:01:47, time: 0.692, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0361, loss_cls: 0.1512, acc: 94.2197, loss_bbox: 0.2010, loss_mask: 0.2153, loss: 0.6171 2024-06-01 12:56:50,650 - mmdet - INFO - Epoch [11][3600/7330] lr: 1.000e-05, eta: 2:01:14, time: 0.643, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0340, loss_cls: 0.1391, acc: 94.6406, loss_bbox: 0.1939, loss_mask: 0.2097, loss: 0.5896 2024-06-01 12:57:25,801 - mmdet - INFO - Epoch [11][3650/7330] lr: 1.000e-05, eta: 2:00:42, time: 0.703, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0338, loss_cls: 0.1447, acc: 94.3608, loss_bbox: 0.1994, loss_mask: 0.2141, loss: 0.6043 2024-06-01 12:57:58,196 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 12:57:58,196 - mmdet - 
INFO - Epoch [11][3700/7330] lr: 1.000e-05, eta: 2:00:09, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0381, loss_cls: 0.1544, acc: 94.1042, loss_bbox: 0.2138, loss_mask: 0.2171, loss: 0.6361 2024-06-01 12:58:30,421 - mmdet - INFO - Epoch [11][3750/7330] lr: 1.000e-05, eta: 1:59:36, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0358, loss_cls: 0.1417, acc: 94.4666, loss_bbox: 0.2033, loss_mask: 0.2121, loss: 0.6055 2024-06-01 12:59:02,755 - mmdet - INFO - Epoch [11][3800/7330] lr: 1.000e-05, eta: 1:59:03, time: 0.647, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0368, loss_cls: 0.1524, acc: 94.1621, loss_bbox: 0.2057, loss_mask: 0.2179, loss: 0.6272 2024-06-01 12:59:35,337 - mmdet - INFO - Epoch [11][3850/7330] lr: 1.000e-05, eta: 1:58:30, time: 0.652, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0365, loss_cls: 0.1485, acc: 94.3711, loss_bbox: 0.1989, loss_mask: 0.2089, loss: 0.6054 2024-06-01 13:00:07,519 - mmdet - INFO - Epoch [11][3900/7330] lr: 1.000e-05, eta: 1:57:57, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0372, loss_cls: 0.1523, acc: 94.2344, loss_bbox: 0.2099, loss_mask: 0.2210, loss: 0.6342 2024-06-01 13:00:39,264 - mmdet - INFO - Epoch [11][3950/7330] lr: 1.000e-05, eta: 1:57:24, time: 0.635, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0338, loss_cls: 0.1434, acc: 94.5352, loss_bbox: 0.1998, loss_mask: 0.2134, loss: 0.6027 2024-06-01 13:01:11,467 - mmdet - INFO - Epoch [11][4000/7330] lr: 1.000e-05, eta: 1:56:51, time: 0.644, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0396, loss_cls: 0.1533, acc: 94.1221, loss_bbox: 0.2147, loss_mask: 0.2196, loss: 0.6416 2024-06-01 13:01:47,913 - mmdet - INFO - Epoch [11][4050/7330] lr: 1.000e-05, eta: 1:56:19, time: 0.729, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0355, loss_cls: 0.1424, acc: 94.4424, loss_bbox: 0.1989, loss_mask: 0.2167, loss: 0.6062 2024-06-01 13:02:19,953 - mmdet - INFO - Epoch [11][4100/7330] lr: 1.000e-05, eta: 1:55:46, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0357, loss_cls: 0.1523, acc: 94.1509, loss_bbox: 0.2098, loss_mask: 0.2201, loss: 0.6313 2024-06-01 13:02:52,291 - mmdet - INFO - Epoch [11][4150/7330] lr: 1.000e-05, eta: 1:55:13, time: 0.647, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0360, loss_cls: 0.1459, acc: 94.4487, loss_bbox: 0.2015, loss_mask: 0.2156, loss: 0.6123 2024-06-01 13:03:27,039 - mmdet - INFO - Epoch [11][4200/7330] lr: 1.000e-05, eta: 1:54:40, time: 0.695, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0358, loss_cls: 0.1445, acc: 94.5251, loss_bbox: 0.1987, loss_mask: 0.2174, loss: 0.6106 2024-06-01 13:04:06,052 - mmdet - INFO - Epoch [11][4250/7330] lr: 1.000e-05, eta: 1:54:08, time: 0.780, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0351, loss_cls: 0.1488, acc: 94.2871, loss_bbox: 0.2005, loss_mask: 0.2145, loss: 0.6111 2024-06-01 13:04:37,997 - mmdet - INFO - Epoch [11][4300/7330] lr: 1.000e-05, eta: 1:53:35, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0334, loss_cls: 0.1433, acc: 94.4343, loss_bbox: 0.2001, loss_mask: 0.2107, loss: 0.6002 2024-06-01 13:05:10,572 - mmdet - INFO - Epoch [11][4350/7330] lr: 1.000e-05, eta: 1:53:02, time: 0.652, data_time: 0.037, memory: 
13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0374, loss_cls: 0.1530, acc: 94.0923, loss_bbox: 0.2081, loss_mask: 0.2207, loss: 0.6332 2024-06-01 13:05:42,776 - mmdet - INFO - Epoch [11][4400/7330] lr: 1.000e-05, eta: 1:52:29, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0366, loss_cls: 0.1503, acc: 94.2273, loss_bbox: 0.2111, loss_mask: 0.2186, loss: 0.6309 2024-06-01 13:06:15,051 - mmdet - INFO - Epoch [11][4450/7330] lr: 1.000e-05, eta: 1:51:56, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0353, loss_cls: 0.1440, acc: 94.5110, loss_bbox: 0.2009, loss_mask: 0.2102, loss: 0.6022 2024-06-01 13:06:51,950 - mmdet - INFO - Epoch [11][4500/7330] lr: 1.000e-05, eta: 1:51:24, time: 0.738, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0367, loss_cls: 0.1492, acc: 94.3933, loss_bbox: 0.2065, loss_mask: 0.2171, loss: 0.6232 2024-06-01 13:07:26,274 - mmdet - INFO - Epoch [11][4550/7330] lr: 1.000e-05, eta: 1:50:51, time: 0.686, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0339, loss_cls: 0.1411, acc: 94.6409, loss_bbox: 0.1923, loss_mask: 0.2104, loss: 0.5894 2024-06-01 13:08:01,107 - mmdet - INFO - Epoch [11][4600/7330] lr: 1.000e-05, eta: 1:50:18, time: 0.697, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0339, loss_cls: 0.1426, acc: 94.5476, loss_bbox: 0.1982, loss_mask: 0.2105, loss: 0.5969 2024-06-01 13:08:33,322 - mmdet - INFO - Epoch [11][4650/7330] lr: 1.000e-05, eta: 1:49:45, time: 0.644, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0357, loss_cls: 0.1547, acc: 94.0466, loss_bbox: 0.2137, loss_mask: 0.2189, loss: 0.6367 2024-06-01 13:09:08,036 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 13:09:08,036 - mmdet - INFO - Epoch [11][4700/7330] lr: 1.000e-05, eta: 1:49:13, time: 0.694, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0352, loss_cls: 0.1438, acc: 94.5447, loss_bbox: 0.1965, loss_mask: 0.2107, loss: 0.5990 2024-06-01 13:09:40,549 - mmdet - INFO - Epoch [11][4750/7330] lr: 1.000e-05, eta: 1:48:40, time: 0.650, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0355, loss_cls: 0.1440, acc: 94.4465, loss_bbox: 0.1992, loss_mask: 0.2145, loss: 0.6056 2024-06-01 13:10:12,653 - mmdet - INFO - Epoch [11][4800/7330] lr: 1.000e-05, eta: 1:48:07, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0330, loss_cls: 0.1417, acc: 94.4871, loss_bbox: 0.1942, loss_mask: 0.2101, loss: 0.5910 2024-06-01 13:10:45,138 - mmdet - INFO - Epoch [11][4850/7330] lr: 1.000e-05, eta: 1:47:34, time: 0.650, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0359, loss_cls: 0.1461, acc: 94.3650, loss_bbox: 0.1988, loss_mask: 0.2156, loss: 0.6101 2024-06-01 13:11:17,250 - mmdet - INFO - Epoch [11][4900/7330] lr: 1.000e-05, eta: 1:47:01, time: 0.642, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0367, loss_cls: 0.1495, acc: 94.3240, loss_bbox: 0.2055, loss_mask: 0.2159, loss: 0.6208 2024-06-01 13:11:49,964 - mmdet - INFO - Epoch [11][4950/7330] lr: 1.000e-05, eta: 1:46:28, time: 0.654, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0384, loss_cls: 0.1519, acc: 94.1477, loss_bbox: 0.2101, loss_mask: 0.2197, loss: 0.6340 2024-06-01 13:12:22,470 - mmdet - INFO - Epoch [11][5000/7330] lr: 1.000e-05, eta: 1:45:55, time: 0.650, data_time: 0.035, 
memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0378, loss_cls: 0.1486, acc: 94.2944, loss_bbox: 0.2098, loss_mask: 0.2200, loss: 0.6303 2024-06-01 13:12:54,639 - mmdet - INFO - Epoch [11][5050/7330] lr: 1.000e-05, eta: 1:45:22, time: 0.643, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0348, loss_cls: 0.1440, acc: 94.5015, loss_bbox: 0.2001, loss_mask: 0.2139, loss: 0.6050 2024-06-01 13:13:29,679 - mmdet - INFO - Epoch [11][5100/7330] lr: 1.000e-05, eta: 1:44:49, time: 0.701, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0363, loss_cls: 0.1514, acc: 94.1816, loss_bbox: 0.2108, loss_mask: 0.2147, loss: 0.6263 2024-06-01 13:14:02,010 - mmdet - INFO - Epoch [11][5150/7330] lr: 1.000e-05, eta: 1:44:16, time: 0.647, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0367, loss_cls: 0.1512, acc: 94.2312, loss_bbox: 0.2079, loss_mask: 0.2202, loss: 0.6295 2024-06-01 13:14:34,336 - mmdet - INFO - Epoch [11][5200/7330] lr: 1.000e-05, eta: 1:43:43, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0330, loss_cls: 0.1361, acc: 94.7512, loss_bbox: 0.1921, loss_mask: 0.2047, loss: 0.5776 2024-06-01 13:15:09,337 - mmdet - INFO - Epoch [11][5250/7330] lr: 1.000e-05, eta: 1:43:11, time: 0.700, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0382, loss_cls: 0.1485, acc: 94.3406, loss_bbox: 0.2028, loss_mask: 0.2146, loss: 0.6176 2024-06-01 13:15:46,815 - mmdet - INFO - Epoch [11][5300/7330] lr: 1.000e-05, eta: 1:42:38, time: 0.750, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0341, loss_cls: 0.1422, acc: 94.6218, loss_bbox: 0.1964, loss_mask: 0.2108, loss: 0.5948 2024-06-01 13:16:19,430 - mmdet - INFO - Epoch [11][5350/7330] lr: 1.000e-05, eta: 1:42:05, time: 0.652, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0355, loss_cls: 0.1457, acc: 94.4502, loss_bbox: 0.1967, loss_mask: 0.2150, loss: 0.6059 2024-06-01 13:16:51,606 - mmdet - INFO - Epoch [11][5400/7330] lr: 1.000e-05, eta: 1:41:32, time: 0.643, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0346, loss_cls: 0.1406, acc: 94.5620, loss_bbox: 0.1939, loss_mask: 0.2135, loss: 0.5937 2024-06-01 13:17:23,662 - mmdet - INFO - Epoch [11][5450/7330] lr: 1.000e-05, eta: 1:40:59, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0343, loss_cls: 0.1455, acc: 94.4033, loss_bbox: 0.1971, loss_mask: 0.2085, loss: 0.5990 2024-06-01 13:17:55,849 - mmdet - INFO - Epoch [11][5500/7330] lr: 1.000e-05, eta: 1:40:26, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0347, loss_cls: 0.1422, acc: 94.5581, loss_bbox: 0.1989, loss_mask: 0.2099, loss: 0.5977 2024-06-01 13:18:30,326 - mmdet - INFO - Epoch [11][5550/7330] lr: 1.000e-05, eta: 1:39:54, time: 0.690, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0373, loss_cls: 0.1478, acc: 94.3694, loss_bbox: 0.2036, loss_mask: 0.2181, loss: 0.6208 2024-06-01 13:19:04,731 - mmdet - INFO - Epoch [11][5600/7330] lr: 1.000e-05, eta: 1:39:21, time: 0.688, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0351, loss_cls: 0.1514, acc: 94.3103, loss_bbox: 0.1999, loss_mask: 0.2097, loss: 0.6099 2024-06-01 13:19:40,550 - mmdet - INFO - Epoch [11][5650/7330] lr: 1.000e-05, eta: 1:38:48, time: 0.716, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0353, loss_cls: 0.1410, acc: 94.5537, 
loss_bbox: 0.1988, loss_mask: 0.2083, loss: 0.5960 2024-06-01 13:20:12,799 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 13:20:12,799 - mmdet - INFO - Epoch [11][5700/7330] lr: 1.000e-05, eta: 1:38:15, time: 0.645, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0348, loss_cls: 0.1464, acc: 94.3247, loss_bbox: 0.2023, loss_mask: 0.2163, loss: 0.6125 2024-06-01 13:20:47,588 - mmdet - INFO - Epoch [11][5750/7330] lr: 1.000e-05, eta: 1:37:43, time: 0.696, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0366, loss_cls: 0.1528, acc: 94.0940, loss_bbox: 0.2107, loss_mask: 0.2229, loss: 0.6365 2024-06-01 13:21:20,107 - mmdet - INFO - Epoch [11][5800/7330] lr: 1.000e-05, eta: 1:37:10, time: 0.650, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0336, loss_cls: 0.1446, acc: 94.4819, loss_bbox: 0.1962, loss_mask: 0.2130, loss: 0.6001 2024-06-01 13:21:52,393 - mmdet - INFO - Epoch [11][5850/7330] lr: 1.000e-05, eta: 1:36:37, time: 0.646, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0345, loss_cls: 0.1534, acc: 94.1770, loss_bbox: 0.2070, loss_mask: 0.2184, loss: 0.6274 2024-06-01 13:22:24,679 - mmdet - INFO - Epoch [11][5900/7330] lr: 1.000e-05, eta: 1:36:04, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0353, loss_cls: 0.1425, acc: 94.5886, loss_bbox: 0.1987, loss_mask: 0.2148, loss: 0.6038 2024-06-01 13:22:56,942 - mmdet - INFO - Epoch [11][5950/7330] lr: 1.000e-05, eta: 1:35:31, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0371, loss_cls: 0.1467, acc: 94.3687, loss_bbox: 0.2031, loss_mask: 0.2134, loss: 0.6135 2024-06-01 13:23:29,032 - mmdet - INFO - Epoch [11][6000/7330] lr: 1.000e-05, eta: 1:34:58, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0335, loss_cls: 0.1430, acc: 94.5837, loss_bbox: 0.1937, loss_mask: 0.2070, loss: 0.5891 2024-06-01 13:24:01,139 - mmdet - INFO - Epoch [11][6050/7330] lr: 1.000e-05, eta: 1:34:25, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0372, loss_cls: 0.1507, acc: 94.2314, loss_bbox: 0.2061, loss_mask: 0.2183, loss: 0.6256 2024-06-01 13:24:33,303 - mmdet - INFO - Epoch [11][6100/7330] lr: 1.000e-05, eta: 1:33:52, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0351, loss_cls: 0.1451, acc: 94.4873, loss_bbox: 0.1977, loss_mask: 0.2148, loss: 0.6047 2024-06-01 13:25:07,754 - mmdet - INFO - Epoch [11][6150/7330] lr: 1.000e-05, eta: 1:33:19, time: 0.689, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0367, loss_cls: 0.1452, acc: 94.4697, loss_bbox: 0.2009, loss_mask: 0.2112, loss: 0.6069 2024-06-01 13:25:40,130 - mmdet - INFO - Epoch [11][6200/7330] lr: 1.000e-05, eta: 1:32:46, time: 0.648, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0356, loss_cls: 0.1446, acc: 94.4670, loss_bbox: 0.1976, loss_mask: 0.2091, loss: 0.5987 2024-06-01 13:26:12,580 - mmdet - INFO - Epoch [11][6250/7330] lr: 1.000e-05, eta: 1:32:13, time: 0.649, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0351, loss_cls: 0.1445, acc: 94.4331, loss_bbox: 0.1974, loss_mask: 0.2117, loss: 0.6008 2024-06-01 13:26:47,303 - mmdet - INFO - Epoch [11][6300/7330] lr: 1.000e-05, eta: 1:31:41, time: 0.694, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0346, loss_cls: 0.1435, acc: 
94.4626, loss_bbox: 0.1993, loss_mask: 0.2220, loss: 0.6119 2024-06-01 13:27:24,410 - mmdet - INFO - Epoch [11][6350/7330] lr: 1.000e-05, eta: 1:31:08, time: 0.742, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0365, loss_cls: 0.1538, acc: 94.1238, loss_bbox: 0.2049, loss_mask: 0.2168, loss: 0.6258 2024-06-01 13:27:56,949 - mmdet - INFO - Epoch [11][6400/7330] lr: 1.000e-05, eta: 1:30:35, time: 0.651, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0359, loss_cls: 0.1460, acc: 94.4248, loss_bbox: 0.1984, loss_mask: 0.2087, loss: 0.6020 2024-06-01 13:28:29,621 - mmdet - INFO - Epoch [11][6450/7330] lr: 1.000e-05, eta: 1:30:02, time: 0.653, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0384, loss_cls: 0.1525, acc: 94.1594, loss_bbox: 0.2077, loss_mask: 0.2139, loss: 0.6274 2024-06-01 13:29:01,676 - mmdet - INFO - Epoch [11][6500/7330] lr: 1.000e-05, eta: 1:29:29, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0350, loss_cls: 0.1434, acc: 94.4543, loss_bbox: 0.2029, loss_mask: 0.2205, loss: 0.6149 2024-06-01 13:29:33,715 - mmdet - INFO - Epoch [11][6550/7330] lr: 1.000e-05, eta: 1:28:56, time: 0.641, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0341, loss_cls: 0.1374, acc: 94.7056, loss_bbox: 0.1940, loss_mask: 0.2123, loss: 0.5904 2024-06-01 13:30:08,833 - mmdet - INFO - Epoch [11][6600/7330] lr: 1.000e-05, eta: 1:28:24, time: 0.702, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0375, loss_cls: 0.1509, acc: 94.1636, loss_bbox: 0.2094, loss_mask: 0.2158, loss: 0.6278 2024-06-01 13:30:45,036 - mmdet - INFO - Epoch [11][6650/7330] lr: 1.000e-05, eta: 1:27:51, time: 0.724, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0361, loss_cls: 0.1487, acc: 94.2322, loss_bbox: 0.2098, loss_mask: 0.2136, loss: 0.6221 2024-06-01 13:31:20,443 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py 2024-06-01 13:31:20,443 - mmdet - INFO - Epoch [11][6700/7330] lr: 1.000e-05, eta: 1:27:18, time: 0.708, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0374, loss_cls: 0.1511, acc: 94.2017, loss_bbox: 0.2081, loss_mask: 0.2208, loss: 0.6302 2024-06-01 13:31:53,206 - mmdet - INFO - Epoch [11][6750/7330] lr: 1.000e-05, eta: 1:26:45, time: 0.655, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0369, loss_cls: 0.1537, acc: 94.1448, loss_bbox: 0.2067, loss_mask: 0.2163, loss: 0.6274 2024-06-01 13:32:27,996 - mmdet - INFO - Epoch [11][6800/7330] lr: 1.000e-05, eta: 1:26:13, time: 0.696, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0348, loss_cls: 0.1455, acc: 94.4043, loss_bbox: 0.2003, loss_mask: 0.2131, loss: 0.6058 2024-06-01 13:33:00,719 - mmdet - INFO - Epoch [11][6850/7330] lr: 1.000e-05, eta: 1:25:40, time: 0.654, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0371, loss_cls: 0.1535, acc: 94.1794, loss_bbox: 0.2089, loss_mask: 0.2216, loss: 0.6347 2024-06-01 13:33:33,184 - mmdet - INFO - Epoch [11][6900/7330] lr: 1.000e-05, eta: 1:25:07, time: 0.649, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0362, loss_cls: 0.1494, acc: 94.3538, loss_bbox: 0.2065, loss_mask: 0.2203, loss: 0.6251 2024-06-01 13:34:05,617 - mmdet - INFO - Epoch [11][6950/7330] lr: 1.000e-05, eta: 1:24:34, time: 0.649, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0345, loss_cls: 
0.1457, acc: 94.4404, loss_bbox: 0.1944, loss_mask: 0.2146, loss: 0.6022
2024-06-01 13:34:37,862 - mmdet - INFO - Epoch [11][7000/7330] lr: 1.000e-05, eta: 1:24:01, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0355, loss_cls: 0.1448, acc: 94.4299, loss_bbox: 0.1963, loss_mask: 0.2118, loss: 0.6021
2024-06-01 13:35:10,078 - mmdet - INFO - Epoch [11][7050/7330] lr: 1.000e-05, eta: 1:23:28, time: 0.644, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0356, loss_cls: 0.1479, acc: 94.3120, loss_bbox: 0.2060, loss_mask: 0.2142, loss: 0.6165
2024-06-01 13:35:42,371 - mmdet - INFO - Epoch [11][7100/7330] lr: 1.000e-05, eta: 1:22:55, time: 0.646, data_time: 0.039, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0368, loss_cls: 0.1483, acc: 94.3013, loss_bbox: 0.2038, loss_mask: 0.2154, loss: 0.6175
2024-06-01 13:36:14,317 - mmdet - INFO - Epoch [11][7150/7330] lr: 1.000e-05, eta: 1:22:22, time: 0.639, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0342, loss_cls: 0.1425, acc: 94.5530, loss_bbox: 0.1932, loss_mask: 0.2091, loss: 0.5909
2024-06-01 13:36:48,907 - mmdet - INFO - Epoch [11][7200/7330] lr: 1.000e-05, eta: 1:21:49, time: 0.692, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0355, loss_cls: 0.1417, acc: 94.5500, loss_bbox: 0.1960, loss_mask: 0.2164, loss: 0.6018
2024-06-01 13:37:21,401 - mmdet - INFO - Epoch [11][7250/7330] lr: 1.000e-05, eta: 1:21:16, time: 0.650, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0355, loss_cls: 0.1426, acc: 94.4827, loss_bbox: 0.1967, loss_mask: 0.2136, loss: 0.5998
2024-06-01 13:37:53,888 - mmdet - INFO - Epoch [11][7300/7330] lr: 1.000e-05, eta: 1:20:43, time: 0.650, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0361, loss_cls: 0.1472, acc: 94.3521, loss_bbox: 0.2039, loss_mask: 0.2131, loss: 0.6134
2024-06-01 13:38:16,175 - mmdet - INFO - Saving checkpoint at 11 epochs
2024-06-01 13:39:53,995 - mmdet - INFO - Evaluating bbox...
2024-06-01 13:40:18,033 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.483
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.712
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.525
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.294
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.525
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.397
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.639
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.768
2024-06-01 13:40:18,034 - mmdet - INFO - Evaluating segm...
2024-06-01 13:40:40,136 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.673
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.451
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.205
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.461
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.647
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.528
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.528
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.528
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.318
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.577
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.724
2024-06-01 13:40:40,503 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 13:40:40,505 - mmdet - INFO - Epoch(val) [11][625] bbox_mAP: 0.4830, bbox_mAP_50: 0.7120, bbox_mAP_75: 0.5250, bbox_mAP_s: 0.2940, bbox_mAP_m: 0.5250, bbox_mAP_l: 0.6590, bbox_mAP_copypaste: 0.483 0.712 0.525 0.294 0.525 0.659, segm_mAP: 0.4250, segm_mAP_50: 0.6730, segm_mAP_75: 0.4510, segm_mAP_s: 0.2050, segm_mAP_m: 0.4610, segm_mAP_l: 0.6470, segm_mAP_copypaste: 0.425 0.673 0.451 0.205 0.461 0.647
2024-06-01 13:41:22,404 - mmdet - INFO - Epoch [12][50/7330] lr: 1.000e-06, eta: 1:19:50, time: 0.838, data_time: 0.104, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0346, loss_cls: 0.1398, acc: 94.5952, loss_bbox: 0.1941, loss_mask: 0.2083, loss: 0.5889
2024-06-01 13:41:56,691 - mmdet - INFO - Epoch [12][100/7330] lr: 1.000e-06, eta: 1:19:17, time: 0.686, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0356, loss_cls: 0.1400, acc: 94.6382, loss_bbox: 0.1969, loss_mask: 0.2113, loss: 0.5955
2024-06-01 13:42:28,765 - mmdet - INFO - Epoch [12][150/7330] lr: 1.000e-06, eta: 1:18:44, time: 0.641, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0365, loss_cls: 0.1446, acc: 94.4629, loss_bbox: 0.2039, loss_mask: 0.2160, loss: 0.6136
2024-06-01 13:43:03,448 - mmdet - INFO - Epoch [12][200/7330] lr: 1.000e-06, eta: 1:18:11, time: 0.694, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0375, loss_cls: 0.1487, acc: 94.2959, loss_bbox: 0.2030, loss_mask: 0.2180, loss: 0.6208
2024-06-01 13:43:35,532 - mmdet - INFO - Epoch [12][250/7330] lr: 1.000e-06, eta: 1:17:38, time: 0.642, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0362, loss_cls: 0.1473, acc: 94.4092, loss_bbox: 0.1983, loss_mask: 0.2128, loss: 0.6085
2024-06-01 13:44:07,984 - mmdet - INFO - Epoch [12][300/7330] lr: 1.000e-06, eta: 1:17:05, time: 0.649, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0347, loss_cls: 0.1441, acc: 94.4231, loss_bbox: 0.2014, loss_mask: 0.2122, loss: 0.6057
2024-06-01 13:44:40,457 - mmdet - INFO - Epoch [12][350/7330] lr: 1.000e-06, eta: 1:16:32, time: 0.649, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0374, loss_cls: 0.1438, acc: 94.4487, loss_bbox: 0.2011, loss_mask: 0.2177, loss: 0.6131
2024-06-01 13:45:12,431 - mmdet - INFO - Epoch [12][400/7330] lr: 1.000e-06, eta: 1:15:59, time: 0.640, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0365, loss_cls: 0.1481, acc: 94.2949, loss_bbox: 0.2032,
loss_mask: 0.2157, loss: 0.6168 2024-06-01 13:45:44,515 - mmdet - INFO - Epoch [12][450/7330] lr: 1.000e-06, eta: 1:15:26, time: 0.642, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0352, loss_cls: 0.1452, acc: 94.3320, loss_bbox: 0.2020, loss_mask: 0.2138, loss: 0.6090 2024-06-01 13:46:16,685 - mmdet - INFO - Epoch [12][500/7330] lr: 1.000e-06, eta: 1:14:53, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0332, loss_cls: 0.1401, acc: 94.5615, loss_bbox: 0.1926, loss_mask: 0.2078, loss: 0.5865 2024-06-01 13:46:48,997 - mmdet - INFO - Epoch [12][550/7330] lr: 1.000e-06, eta: 1:14:21, time: 0.646, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0359, loss_cls: 0.1408, acc: 94.6365, loss_bbox: 0.1905, loss_mask: 0.2102, loss: 0.5899 2024-06-01 13:47:21,700 - mmdet - INFO - Epoch [12][600/7330] lr: 1.000e-06, eta: 1:13:48, time: 0.654, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0392, loss_cls: 0.1490, acc: 94.2705, loss_bbox: 0.2060, loss_mask: 0.2177, loss: 0.6268 2024-06-01 13:47:53,568 - mmdet - INFO - Epoch [12][650/7330] lr: 1.000e-06, eta: 1:13:15, time: 0.637, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0339, loss_cls: 0.1408, acc: 94.5342, loss_bbox: 0.1947, loss_mask: 0.2139, loss: 0.5949 2024-06-01 13:48:25,521 - mmdet - INFO - Epoch [12][700/7330] lr: 1.000e-06, eta: 1:12:42, time: 0.639, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0363, loss_cls: 0.1438, acc: 94.5066, loss_bbox: 0.2035, loss_mask: 0.2170, loss: 0.6136 2024-06-01 13:48:57,723 - mmdet - INFO - Epoch [12][750/7330] lr: 1.000e-06, eta: 1:12:09, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0368, loss_cls: 0.1460, acc: 94.3662, loss_bbox: 0.2022, loss_mask: 0.2195, loss: 0.6173 2024-06-01 13:49:32,846 - mmdet - INFO - Epoch [12][800/7330] lr: 1.000e-06, eta: 1:11:36, time: 0.703, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0344, loss_cls: 0.1438, acc: 94.3755, loss_bbox: 0.1994, loss_mask: 0.2138, loss: 0.6030 2024-06-01 13:50:05,209 - mmdet - INFO - Epoch [12][850/7330] lr: 1.000e-06, eta: 1:11:03, time: 0.647, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0361, loss_cls: 0.1442, acc: 94.4058, loss_bbox: 0.2063, loss_mask: 0.2160, loss: 0.6143 2024-06-01 13:50:40,243 - mmdet - INFO - Epoch [12][900/7330] lr: 1.000e-06, eta: 1:10:30, time: 0.701, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0368, loss_cls: 0.1464, acc: 94.3799, loss_bbox: 0.1996, loss_mask: 0.2162, loss: 0.6117 2024-06-01 13:51:15,448 - mmdet - INFO - Epoch [12][950/7330] lr: 1.000e-06, eta: 1:09:58, time: 0.704, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0366, loss_cls: 0.1485, acc: 94.2942, loss_bbox: 0.2053, loss_mask: 0.2128, loss: 0.6155 2024-06-01 13:51:47,993 - mmdet - INFO - Epoch [12][1000/7330] lr: 1.000e-06, eta: 1:09:25, time: 0.651, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0371, loss_cls: 0.1473, acc: 94.3599, loss_bbox: 0.2024, loss_mask: 0.2157, loss: 0.6151 2024-06-01 13:52:22,675 - mmdet - INFO - Epoch [12][1050/7330] lr: 1.000e-06, eta: 1:08:52, time: 0.694, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0374, loss_cls: 0.1503, acc: 94.1687, loss_bbox: 0.2064, loss_mask: 0.2197, loss: 0.6274 2024-06-01 13:52:54,967 - mmdet - INFO - Epoch [12][1100/7330] lr: 
1.000e-06, eta: 1:08:19, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0356, loss_cls: 0.1421, acc: 94.5615, loss_bbox: 0.1945, loss_mask: 0.2097, loss: 0.5938 2024-06-01 13:53:29,061 - mmdet - INFO - Epoch [12][1150/7330] lr: 1.000e-06, eta: 1:07:46, time: 0.682, data_time: 0.024, memory: 13277, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0332, loss_cls: 0.1371, acc: 94.7366, loss_bbox: 0.1908, loss_mask: 0.2070, loss: 0.5792 2024-06-01 13:54:01,101 - mmdet - INFO - Epoch [12][1200/7330] lr: 1.000e-06, eta: 1:07:13, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0367, loss_cls: 0.1469, acc: 94.3198, loss_bbox: 0.2012, loss_mask: 0.2131, loss: 0.6105 2024-06-01 13:54:36,336 - mmdet - INFO - Epoch [12][1250/7330] lr: 1.000e-06, eta: 1:06:40, time: 0.705, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0338, loss_cls: 0.1390, acc: 94.7466, loss_bbox: 0.1947, loss_mask: 0.2111, loss: 0.5913 2024-06-01 13:55:10,966 - mmdet - INFO - Epoch [12][1300/7330] lr: 1.000e-06, eta: 1:06:08, time: 0.693, data_time: 0.025, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0338, loss_cls: 0.1377, acc: 94.7051, loss_bbox: 0.1889, loss_mask: 0.2064, loss: 0.5785 2024-06-01 13:55:45,592 - mmdet - INFO - Epoch [12][1350/7330] lr: 1.000e-06, eta: 1:05:35, time: 0.693, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0341, loss_cls: 0.1455, acc: 94.4180, loss_bbox: 0.1978, loss_mask: 0.2120, loss: 0.6020 2024-06-01 13:56:17,812 - mmdet - INFO - Epoch [12][1400/7330] lr: 1.000e-06, eta: 1:05:02, time: 0.644, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0367, loss_cls: 0.1475, acc: 94.2358, loss_bbox: 0.2037, loss_mask: 0.2175, loss: 0.6183 2024-06-01 13:56:50,042 - mmdet - INFO - Epoch [12][1450/7330] lr: 1.000e-06, eta: 1:04:29, time: 0.645, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0354, loss_cls: 0.1414, acc: 94.5618, loss_bbox: 0.1936, loss_mask: 0.2134, loss: 0.5964 2024-06-01 13:57:22,304 - mmdet - INFO - Epoch [12][1500/7330] lr: 1.000e-06, eta: 1:03:56, time: 0.645, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0346, loss_cls: 0.1462, acc: 94.3665, loss_bbox: 0.2064, loss_mask: 0.2137, loss: 0.6141 2024-06-01 13:57:54,578 - mmdet - INFO - Epoch [12][1550/7330] lr: 1.000e-06, eta: 1:03:23, time: 0.645, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0351, loss_cls: 0.1505, acc: 94.1980, loss_bbox: 0.2075, loss_mask: 0.2176, loss: 0.6234 2024-06-01 13:58:26,859 - mmdet - INFO - Epoch [12][1600/7330] lr: 1.000e-06, eta: 1:02:50, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0329, loss_cls: 0.1415, acc: 94.5312, loss_bbox: 0.1966, loss_mask: 0.2117, loss: 0.5948 2024-06-01 13:58:59,065 - mmdet - INFO - Epoch [12][1650/7330] lr: 1.000e-06, eta: 1:02:17, time: 0.644, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0387, loss_cls: 0.1516, acc: 94.1980, loss_bbox: 0.2110, loss_mask: 0.2178, loss: 0.6330 2024-06-01 13:59:31,463 - mmdet - INFO - Epoch [12][1700/7330] lr: 1.000e-06, eta: 1:01:44, time: 0.648, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0350, loss_cls: 0.1444, acc: 94.4214, loss_bbox: 0.1989, loss_mask: 0.2105, loss: 0.6007 2024-06-01 14:00:04,036 - mmdet - INFO - Epoch [12][1750/7330] lr: 1.000e-06, eta: 1:01:11, time: 0.651, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0127, 
loss_rpn_bbox: 0.0376, loss_cls: 0.1509, acc: 94.1404, loss_bbox: 0.2082, loss_mask: 0.2119, loss: 0.6213 2024-06-01 14:00:36,420 - mmdet - INFO - Epoch [12][1800/7330] lr: 1.000e-06, eta: 1:00:38, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0366, loss_cls: 0.1498, acc: 94.2356, loss_bbox: 0.2048, loss_mask: 0.2122, loss: 0.6162 2024-06-01 14:01:11,823 - mmdet - INFO - Epoch [12][1850/7330] lr: 1.000e-06, eta: 1:00:06, time: 0.708, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0360, loss_cls: 0.1408, acc: 94.6211, loss_bbox: 0.1984, loss_mask: 0.2137, loss: 0.6007 2024-06-01 14:01:44,458 - mmdet - INFO - Epoch [12][1900/7330] lr: 1.000e-06, eta: 0:59:33, time: 0.653, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0364, loss_cls: 0.1451, acc: 94.4109, loss_bbox: 0.2039, loss_mask: 0.2180, loss: 0.6162 2024-06-01 14:02:20,065 - mmdet - INFO - Epoch [12][1950/7330] lr: 1.000e-06, eta: 0:59:00, time: 0.712, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0352, loss_cls: 0.1456, acc: 94.3748, loss_bbox: 0.1978, loss_mask: 0.2118, loss: 0.6034 2024-06-01 14:02:55,531 - mmdet - INFO - Epoch [12][2000/7330] lr: 1.000e-06, eta: 0:58:27, time: 0.709, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0371, loss_cls: 0.1476, acc: 94.2798, loss_bbox: 0.2041, loss_mask: 0.2122, loss: 0.6139 2024-06-01 14:03:27,816 - mmdet - INFO - Epoch [12][2050/7330] lr: 1.000e-06, eta: 0:57:54, time: 0.646, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0360, loss_cls: 0.1433, acc: 94.5156, loss_bbox: 0.2001, loss_mask: 0.2156, loss: 0.6068 2024-06-01 14:04:02,506 - mmdet - INFO - Epoch [12][2100/7330] lr: 1.000e-06, eta: 0:57:22, time: 0.694, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0339, loss_cls: 0.1431, acc: 94.5022, loss_bbox: 0.1976, loss_mask: 0.2121, loss: 0.5988 2024-06-01 14:04:34,706 - mmdet - INFO - Epoch [12][2150/7330] lr: 1.000e-06, eta: 0:56:49, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0373, loss_cls: 0.1449, acc: 94.4812, loss_bbox: 0.1997, loss_mask: 0.2121, loss: 0.6078 2024-06-01 14:05:09,157 - mmdet - INFO - Epoch [12][2200/7330] lr: 1.000e-06, eta: 0:56:16, time: 0.689, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0361, loss_cls: 0.1456, acc: 94.3948, loss_bbox: 0.2060, loss_mask: 0.2129, loss: 0.6132 2024-06-01 14:05:41,613 - mmdet - INFO - Epoch [12][2250/7330] lr: 1.000e-06, eta: 0:55:43, time: 0.649, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0360, loss_cls: 0.1444, acc: 94.4792, loss_bbox: 0.2000, loss_mask: 0.2152, loss: 0.6080 2024-06-01 14:06:16,279 - mmdet - INFO - Epoch [12][2300/7330] lr: 1.000e-06, eta: 0:55:10, time: 0.693, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0376, loss_cls: 0.1487, acc: 94.2656, loss_bbox: 0.2038, loss_mask: 0.2112, loss: 0.6150 2024-06-01 14:06:50,806 - mmdet - INFO - Epoch [12][2350/7330] lr: 1.000e-06, eta: 0:54:37, time: 0.691, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0331, loss_cls: 0.1338, acc: 94.8013, loss_bbox: 0.1894, loss_mask: 0.2098, loss: 0.5774 2024-06-01 14:07:25,450 - mmdet - INFO - Epoch [12][2400/7330] lr: 1.000e-06, eta: 0:54:04, time: 0.693, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0356, loss_cls: 0.1411, acc: 94.5745, loss_bbox: 0.2013, loss_mask: 0.2146, 
loss: 0.6047 2024-06-01 14:07:57,740 - mmdet - INFO - Epoch [12][2450/7330] lr: 1.000e-06, eta: 0:53:31, time: 0.646, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0356, loss_cls: 0.1417, acc: 94.6306, loss_bbox: 0.1914, loss_mask: 0.2125, loss: 0.5944 2024-06-01 14:08:29,919 - mmdet - INFO - Epoch [12][2500/7330] lr: 1.000e-06, eta: 0:52:59, time: 0.644, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0338, loss_cls: 0.1427, acc: 94.4832, loss_bbox: 0.2002, loss_mask: 0.2099, loss: 0.5987 2024-06-01 14:09:02,510 - mmdet - INFO - Epoch [12][2550/7330] lr: 1.000e-06, eta: 0:52:26, time: 0.651, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0351, loss_cls: 0.1470, acc: 94.3157, loss_bbox: 0.1994, loss_mask: 0.2147, loss: 0.6099 2024-06-01 14:09:34,915 - mmdet - INFO - Epoch [12][2600/7330] lr: 1.000e-06, eta: 0:51:53, time: 0.649, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0354, loss_cls: 0.1441, acc: 94.4348, loss_bbox: 0.1999, loss_mask: 0.2112, loss: 0.6035 2024-06-01 14:10:07,377 - mmdet - INFO - Epoch [12][2650/7330] lr: 1.000e-06, eta: 0:51:20, time: 0.649, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0360, loss_cls: 0.1457, acc: 94.4136, loss_bbox: 0.2057, loss_mask: 0.2192, loss: 0.6195 2024-06-01 14:10:39,767 - mmdet - INFO - Epoch [12][2700/7330] lr: 1.000e-06, eta: 0:50:47, time: 0.648, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0363, loss_cls: 0.1409, acc: 94.6011, loss_bbox: 0.1974, loss_mask: 0.2134, loss: 0.5998 2024-06-01 14:11:12,129 - mmdet - INFO - Epoch [12][2750/7330] lr: 1.000e-06, eta: 0:50:14, time: 0.647, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0369, loss_cls: 0.1463, acc: 94.3335, loss_bbox: 0.2032, loss_mask: 0.2093, loss: 0.6091 2024-06-01 14:11:44,554 - mmdet - INFO - Epoch [12][2800/7330] lr: 1.000e-06, eta: 0:49:41, time: 0.648, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0368, loss_cls: 0.1463, acc: 94.4050, loss_bbox: 0.2012, loss_mask: 0.2133, loss: 0.6100 2024-06-01 14:12:16,737 - mmdet - INFO - Epoch [12][2850/7330] lr: 1.000e-06, eta: 0:49:08, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0343, loss_cls: 0.1398, acc: 94.6455, loss_bbox: 0.1936, loss_mask: 0.2088, loss: 0.5878 2024-06-01 14:12:51,944 - mmdet - INFO - Epoch [12][2900/7330] lr: 1.000e-06, eta: 0:48:35, time: 0.704, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0355, loss_cls: 0.1396, acc: 94.6804, loss_bbox: 0.1947, loss_mask: 0.2105, loss: 0.5929 2024-06-01 14:13:24,134 - mmdet - INFO - Epoch [12][2950/7330] lr: 1.000e-06, eta: 0:48:02, time: 0.644, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0357, loss_cls: 0.1483, acc: 94.3333, loss_bbox: 0.2039, loss_mask: 0.2147, loss: 0.6151 2024-06-01 14:13:59,314 - mmdet - INFO - Epoch [12][3000/7330] lr: 1.000e-06, eta: 0:47:29, time: 0.704, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0355, loss_cls: 0.1437, acc: 94.4802, loss_bbox: 0.2028, loss_mask: 0.2146, loss: 0.6087 2024-06-01 14:14:35,002 - mmdet - INFO - Epoch [12][3050/7330] lr: 1.000e-06, eta: 0:46:57, time: 0.714, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0357, loss_cls: 0.1414, acc: 94.5847, loss_bbox: 0.1976, loss_mask: 0.2130, loss: 0.6002 2024-06-01 14:15:07,556 - mmdet - INFO - Epoch [12][3100/7330] lr: 1.000e-06, eta: 
0:46:24, time: 0.651, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0365, loss_cls: 0.1442, acc: 94.3599, loss_bbox: 0.2009, loss_mask: 0.2145, loss: 0.6101 2024-06-01 14:15:42,380 - mmdet - INFO - Epoch [12][3150/7330] lr: 1.000e-06, eta: 0:45:51, time: 0.696, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0355, loss_cls: 0.1447, acc: 94.4373, loss_bbox: 0.1991, loss_mask: 0.2098, loss: 0.6024 2024-06-01 14:16:14,844 - mmdet - INFO - Epoch [12][3200/7330] lr: 1.000e-06, eta: 0:45:18, time: 0.650, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0349, loss_cls: 0.1415, acc: 94.5237, loss_bbox: 0.1960, loss_mask: 0.2125, loss: 0.5980 2024-06-01 14:16:51,725 - mmdet - INFO - Epoch [12][3250/7330] lr: 1.000e-06, eta: 0:44:45, time: 0.738, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0359, loss_cls: 0.1465, acc: 94.3867, loss_bbox: 0.2031, loss_mask: 0.2100, loss: 0.6087 2024-06-01 14:17:23,685 - mmdet - INFO - Epoch [12][3300/7330] lr: 1.000e-06, eta: 0:44:12, time: 0.639, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0322, loss_cls: 0.1345, acc: 94.8308, loss_bbox: 0.1853, loss_mask: 0.2100, loss: 0.5741 2024-06-01 14:17:58,608 - mmdet - INFO - Epoch [12][3350/7330] lr: 1.000e-06, eta: 0:43:40, time: 0.698, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0371, loss_cls: 0.1488, acc: 94.2407, loss_bbox: 0.2043, loss_mask: 0.2132, loss: 0.6173 2024-06-01 14:18:33,028 - mmdet - INFO - Epoch [12][3400/7330] lr: 1.000e-06, eta: 0:43:07, time: 0.688, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0368, loss_cls: 0.1463, acc: 94.3557, loss_bbox: 0.2007, loss_mask: 0.2164, loss: 0.6142 2024-06-01 14:19:07,534 - mmdet - INFO - Epoch [12][3450/7330] lr: 1.000e-06, eta: 0:42:34, time: 0.690, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0349, loss_cls: 0.1407, acc: 94.5620, loss_bbox: 0.1980, loss_mask: 0.2106, loss: 0.5972 2024-06-01 14:19:39,703 - mmdet - INFO - Epoch [12][3500/7330] lr: 1.000e-06, eta: 0:42:01, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0345, loss_cls: 0.1467, acc: 94.4465, loss_bbox: 0.2012, loss_mask: 0.2173, loss: 0.6124 2024-06-01 14:20:11,757 - mmdet - INFO - Epoch [12][3550/7330] lr: 1.000e-06, eta: 0:41:28, time: 0.641, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0352, loss_cls: 0.1429, acc: 94.5693, loss_bbox: 0.1940, loss_mask: 0.2092, loss: 0.5929 2024-06-01 14:20:43,892 - mmdet - INFO - Epoch [12][3600/7330] lr: 1.000e-06, eta: 0:40:55, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0344, loss_cls: 0.1373, acc: 94.7454, loss_bbox: 0.1937, loss_mask: 0.2077, loss: 0.5843 2024-06-01 14:21:16,214 - mmdet - INFO - Epoch [12][3650/7330] lr: 1.000e-06, eta: 0:40:22, time: 0.646, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0332, loss_cls: 0.1378, acc: 94.7324, loss_bbox: 0.1927, loss_mask: 0.2060, loss: 0.5813 2024-06-01 14:21:48,512 - mmdet - INFO - Epoch [12][3700/7330] lr: 1.000e-06, eta: 0:39:49, time: 0.646, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0365, loss_cls: 0.1479, acc: 94.3286, loss_bbox: 0.2016, loss_mask: 0.2142, loss: 0.6145 2024-06-01 14:22:20,593 - mmdet - INFO - Epoch [12][3750/7330] lr: 1.000e-06, eta: 0:39:16, time: 0.642, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 
0.0352, loss_cls: 0.1413, acc: 94.5845, loss_bbox: 0.1988, loss_mask: 0.2144, loss: 0.6013 2024-06-01 14:22:52,799 - mmdet - INFO - Epoch [12][3800/7330] lr: 1.000e-06, eta: 0:38:43, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0348, loss_cls: 0.1448, acc: 94.4194, loss_bbox: 0.2018, loss_mask: 0.2125, loss: 0.6059 2024-06-01 14:23:24,972 - mmdet - INFO - Epoch [12][3850/7330] lr: 1.000e-06, eta: 0:38:10, time: 0.643, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0343, loss_cls: 0.1370, acc: 94.7529, loss_bbox: 0.1913, loss_mask: 0.2103, loss: 0.5848 2024-06-01 14:23:57,519 - mmdet - INFO - Epoch [12][3900/7330] lr: 1.000e-06, eta: 0:37:37, time: 0.651, data_time: 0.041, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0361, loss_cls: 0.1453, acc: 94.3796, loss_bbox: 0.2001, loss_mask: 0.2100, loss: 0.6043 2024-06-01 14:24:32,373 - mmdet - INFO - Epoch [12][3950/7330] lr: 1.000e-06, eta: 0:37:04, time: 0.697, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0366, loss_cls: 0.1460, acc: 94.3721, loss_bbox: 0.2023, loss_mask: 0.2179, loss: 0.6166 2024-06-01 14:25:04,506 - mmdet - INFO - Epoch [12][4000/7330] lr: 1.000e-06, eta: 0:36:32, time: 0.643, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0364, loss_cls: 0.1451, acc: 94.3396, loss_bbox: 0.2008, loss_mask: 0.2142, loss: 0.6084 2024-06-01 14:25:39,028 - mmdet - INFO - Epoch [12][4050/7330] lr: 1.000e-06, eta: 0:35:59, time: 0.690, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0325, loss_cls: 0.1348, acc: 94.8318, loss_bbox: 0.1876, loss_mask: 0.2065, loss: 0.5731 2024-06-01 14:26:14,174 - mmdet - INFO - Epoch [12][4100/7330] lr: 1.000e-06, eta: 0:35:26, time: 0.703, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0359, loss_cls: 0.1447, acc: 94.4016, loss_bbox: 0.2004, loss_mask: 0.2171, loss: 0.6110 2024-06-01 14:26:46,541 - mmdet - INFO - Epoch [12][4150/7330] lr: 1.000e-06, eta: 0:34:53, time: 0.647, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0354, loss_cls: 0.1461, acc: 94.3577, loss_bbox: 0.2022, loss_mask: 0.2096, loss: 0.6056 2024-06-01 14:27:21,248 - mmdet - INFO - Epoch [12][4200/7330] lr: 1.000e-06, eta: 0:34:20, time: 0.694, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0336, loss_cls: 0.1396, acc: 94.6895, loss_bbox: 0.1914, loss_mask: 0.2115, loss: 0.5875 2024-06-01 14:27:53,676 - mmdet - INFO - Epoch [12][4250/7330] lr: 1.000e-06, eta: 0:33:47, time: 0.649, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0357, loss_cls: 0.1481, acc: 94.2773, loss_bbox: 0.2056, loss_mask: 0.2144, loss: 0.6169 2024-06-01 14:28:28,889 - mmdet - INFO - Epoch [12][4300/7330] lr: 1.000e-06, eta: 0:33:14, time: 0.704, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0367, loss_cls: 0.1458, acc: 94.3306, loss_bbox: 0.1960, loss_mask: 0.2131, loss: 0.6038 2024-06-01 14:29:01,402 - mmdet - INFO - Epoch [12][4350/7330] lr: 1.000e-06, eta: 0:32:41, time: 0.650, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0337, loss_cls: 0.1411, acc: 94.5774, loss_bbox: 0.1934, loss_mask: 0.2106, loss: 0.5914 2024-06-01 14:29:38,267 - mmdet - INFO - Epoch [12][4400/7330] lr: 1.000e-06, eta: 0:32:09, time: 0.737, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0354, loss_cls: 0.1455, acc: 94.2896, loss_bbox: 0.2064, loss_mask: 0.2160, loss: 0.6155 
2024-06-01 14:30:10,944 - mmdet - INFO - Epoch [12][4450/7330] lr: 1.000e-06, eta: 0:31:36, time: 0.654, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0355, loss_cls: 0.1481, acc: 94.4441, loss_bbox: 0.1995, loss_mask: 0.2138, loss: 0.6103 2024-06-01 14:30:45,662 - mmdet - INFO - Epoch [12][4500/7330] lr: 1.000e-06, eta: 0:31:03, time: 0.694, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0346, loss_cls: 0.1445, acc: 94.5073, loss_bbox: 0.1954, loss_mask: 0.2108, loss: 0.5976 2024-06-01 14:31:17,799 - mmdet - INFO - Epoch [12][4550/7330] lr: 1.000e-06, eta: 0:30:30, time: 0.643, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0346, loss_cls: 0.1393, acc: 94.6797, loss_bbox: 0.1930, loss_mask: 0.2078, loss: 0.5865 2024-06-01 14:31:50,005 - mmdet - INFO - Epoch [12][4600/7330] lr: 1.000e-06, eta: 0:29:57, time: 0.644, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0338, loss_cls: 0.1384, acc: 94.7039, loss_bbox: 0.1919, loss_mask: 0.2076, loss: 0.5835 2024-06-01 14:32:22,459 - mmdet - INFO - Epoch [12][4650/7330] lr: 1.000e-06, eta: 0:29:24, time: 0.649, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0348, loss_cls: 0.1405, acc: 94.5334, loss_bbox: 0.2003, loss_mask: 0.2176, loss: 0.6059 2024-06-01 14:32:54,826 - mmdet - INFO - Epoch [12][4700/7330] lr: 1.000e-06, eta: 0:28:51, time: 0.647, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0322, loss_cls: 0.1348, acc: 94.8113, loss_bbox: 0.1894, loss_mask: 0.2070, loss: 0.5750 2024-06-01 14:33:27,228 - mmdet - INFO - Epoch [12][4750/7330] lr: 1.000e-06, eta: 0:28:18, time: 0.648, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0373, loss_cls: 0.1495, acc: 94.2063, loss_bbox: 0.2077, loss_mask: 0.2200, loss: 0.6276 2024-06-01 14:33:59,497 - mmdet - INFO - Epoch [12][4800/7330] lr: 1.000e-06, eta: 0:27:45, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0328, loss_cls: 0.1361, acc: 94.7583, loss_bbox: 0.1895, loss_mask: 0.2101, loss: 0.5811 2024-06-01 14:34:31,918 - mmdet - INFO - Epoch [12][4850/7330] lr: 1.000e-06, eta: 0:27:12, time: 0.648, data_time: 0.026, memory: 13277, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0329, loss_cls: 0.1352, acc: 94.7695, loss_bbox: 0.1902, loss_mask: 0.2120, loss: 0.5817 2024-06-01 14:35:04,164 - mmdet - INFO - Epoch [12][4900/7330] lr: 1.000e-06, eta: 0:26:39, time: 0.645, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0346, loss_cls: 0.1424, acc: 94.4116, loss_bbox: 0.2019, loss_mask: 0.2147, loss: 0.6057 2024-06-01 14:35:36,517 - mmdet - INFO - Epoch [12][4950/7330] lr: 1.000e-06, eta: 0:26:06, time: 0.647, data_time: 0.036, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0354, loss_cls: 0.1431, acc: 94.4473, loss_bbox: 0.1973, loss_mask: 0.2155, loss: 0.6039 2024-06-01 14:36:11,888 - mmdet - INFO - Epoch [12][5000/7330] lr: 1.000e-06, eta: 0:25:34, time: 0.707, data_time: 0.040, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0382, loss_cls: 0.1468, acc: 94.2944, loss_bbox: 0.2070, loss_mask: 0.2167, loss: 0.6212 2024-06-01 14:36:44,018 - mmdet - INFO - Epoch [12][5050/7330] lr: 1.000e-06, eta: 0:25:01, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0358, loss_cls: 0.1432, acc: 94.5273, loss_bbox: 0.1947, loss_mask: 0.2116, loss: 0.5991 2024-06-01 14:37:18,778 - mmdet - INFO - Epoch [12][5100/7330] lr: 1.000e-06, eta: 0:24:28, 
time: 0.695, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0360, loss_cls: 0.1484, acc: 94.2781, loss_bbox: 0.2007, loss_mask: 0.2145, loss: 0.6117
2024-06-01 14:37:53,348 - mmdet - INFO - Epoch [12][5150/7330] lr: 1.000e-06, eta: 0:23:55, time: 0.692, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0347, loss_cls: 0.1380, acc: 94.6680, loss_bbox: 0.1952, loss_mask: 0.2096, loss: 0.5897
2024-06-01 14:38:25,476 - mmdet - INFO - Epoch [12][5200/7330] lr: 1.000e-06, eta: 0:23:22, time: 0.643, data_time: 0.038, memory: 13277, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0345, loss_cls: 0.1445, acc: 94.4517, loss_bbox: 0.2027, loss_mask: 0.2099, loss: 0.6029
2024-06-01 14:38:59,573 - mmdet - INFO - Epoch [12][5250/7330] lr: 1.000e-06, eta: 0:22:49, time: 0.682, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0363, loss_cls: 0.1430, acc: 94.4866, loss_bbox: 0.2011, loss_mask: 0.2131, loss: 0.6060
2024-06-01 14:39:32,198 - mmdet - INFO - Epoch [12][5300/7330] lr: 1.000e-06, eta: 0:22:16, time: 0.652, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0354, loss_cls: 0.1451, acc: 94.3684, loss_bbox: 0.2005, loss_mask: 0.2149, loss: 0.6093
2024-06-01 14:40:07,874 - mmdet - INFO - Epoch [12][5350/7330] lr: 1.000e-06, eta: 0:21:43, time: 0.714, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0343, loss_cls: 0.1433, acc: 94.5627, loss_bbox: 0.1951, loss_mask: 0.2090, loss: 0.5943
2024-06-01 14:40:40,102 - mmdet - INFO - Epoch [12][5400/7330] lr: 1.000e-06, eta: 0:21:10, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0343, loss_cls: 0.1407, acc: 94.5730, loss_bbox: 0.1972, loss_mask: 0.2090, loss: 0.5941
2024-06-01 14:41:17,101 - mmdet - INFO - Epoch [12][5450/7330] lr: 1.000e-06, eta: 0:20:37, time: 0.740, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0355, loss_cls: 0.1475, acc: 94.2949, loss_bbox: 0.2056, loss_mask: 0.2116, loss: 0.6122
2024-06-01 14:41:52,231 - mmdet - INFO - Epoch [12][5500/7330] lr: 1.000e-06, eta: 0:20:05, time: 0.703, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0385, loss_cls: 0.1450, acc: 94.4475, loss_bbox: 0.1967, loss_mask: 0.2157, loss: 0.6094
2024-06-01 14:42:24,469 - mmdet - INFO - Epoch [12][5550/7330] lr: 1.000e-06, eta: 0:19:32, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0326, loss_cls: 0.1394, acc: 94.6492, loss_bbox: 0.1931, loss_mask: 0.2094, loss: 0.5864
2024-06-01 14:42:56,931 - mmdet - INFO - Epoch [12][5600/7330] lr: 1.000e-06, eta: 0:18:59, time: 0.649, data_time: 0.027, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0362, loss_cls: 0.1461, acc: 94.3130, loss_bbox: 0.2044, loss_mask: 0.2130, loss: 0.6129
2024-06-01 14:43:29,191 - mmdet - INFO - Epoch [12][5650/7330] lr: 1.000e-06, eta: 0:18:26, time: 0.645, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0361, loss_cls: 0.1493, acc: 94.2114, loss_bbox: 0.2047, loss_mask: 0.2160, loss: 0.6184
2024-06-01 14:44:01,580 - mmdet - INFO - Epoch [12][5700/7330] lr: 1.000e-06, eta: 0:17:53, time: 0.648, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0367, loss_cls: 0.1484, acc: 94.2871, loss_bbox: 0.2055, loss_mask: 0.2142, loss: 0.6169
2024-06-01 14:44:33,725 - mmdet - INFO - Epoch [12][5750/7330] lr: 1.000e-06, eta: 0:17:20, time: 0.643, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0352, loss_cls: 0.1444, acc: 94.4800, loss_bbox: 0.1994, loss_mask: 0.2097, loss: 0.6016
2024-06-01 14:45:06,119 - mmdet - INFO - Epoch [12][5800/7330] lr: 1.000e-06, eta: 0:16:47, time: 0.648, data_time: 0.037, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0399, loss_cls: 0.1538, acc: 94.0020, loss_bbox: 0.2114, loss_mask: 0.2178, loss: 0.6367
2024-06-01 14:45:38,266 - mmdet - INFO - Epoch [12][5850/7330] lr: 1.000e-06, eta: 0:16:14, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0360, loss_cls: 0.1464, acc: 94.3748, loss_bbox: 0.2005, loss_mask: 0.2139, loss: 0.6096
2024-06-01 14:46:10,346 - mmdet - INFO - Epoch [12][5900/7330] lr: 1.000e-06, eta: 0:15:41, time: 0.642, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0333, loss_cls: 0.1348, acc: 94.8220, loss_bbox: 0.1892, loss_mask: 0.2066, loss: 0.5746
2024-06-01 14:46:42,432 - mmdet - INFO - Epoch [12][5950/7330] lr: 1.000e-06, eta: 0:15:08, time: 0.642, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0326, loss_cls: 0.1404, acc: 94.6165, loss_bbox: 0.1931, loss_mask: 0.2084, loss: 0.5854
2024-06-01 14:47:14,641 - mmdet - INFO - Epoch [12][6000/7330] lr: 1.000e-06, eta: 0:14:35, time: 0.644, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0365, loss_cls: 0.1437, acc: 94.4233, loss_bbox: 0.2021, loss_mask: 0.2142, loss: 0.6095
2024-06-01 14:47:49,342 - mmdet - INFO - Epoch [12][6050/7330] lr: 1.000e-06, eta: 0:14:02, time: 0.694, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0361, loss_cls: 0.1454, acc: 94.4226, loss_bbox: 0.2024, loss_mask: 0.2123, loss: 0.6091
2024-06-01 14:48:24,373 - mmdet - INFO - Epoch [12][6100/7330] lr: 1.000e-06, eta: 0:13:29, time: 0.701, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0356, loss_cls: 0.1469, acc: 94.3191, loss_bbox: 0.2050, loss_mask: 0.2160, loss: 0.6167
2024-06-01 14:48:58,864 - mmdet - INFO - Epoch [12][6150/7330] lr: 1.000e-06, eta: 0:12:57, time: 0.690, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0376, loss_cls: 0.1434, acc: 94.4585, loss_bbox: 0.2003, loss_mask: 0.2149, loss: 0.6097
2024-06-01 14:49:31,091 - mmdet - INFO - Epoch [12][6200/7330] lr: 1.000e-06, eta: 0:12:24, time: 0.645, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0336, loss_cls: 0.1451, acc: 94.4517, loss_bbox: 0.1983, loss_mask: 0.2137, loss: 0.6029
2024-06-01 14:50:03,431 - mmdet - INFO - Epoch [12][6250/7330] lr: 1.000e-06, eta: 0:11:51, time: 0.647, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0347, loss_cls: 0.1419, acc: 94.5374, loss_bbox: 0.1947, loss_mask: 0.2104, loss: 0.5928
2024-06-01 14:50:38,015 - mmdet - INFO - Epoch [12][6300/7330] lr: 1.000e-06, eta: 0:11:18, time: 0.692, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0360, loss_cls: 0.1453, acc: 94.4255, loss_bbox: 0.1976, loss_mask: 0.2110, loss: 0.6022
2024-06-01 14:51:10,315 - mmdet - INFO - Epoch [12][6350/7330] lr: 1.000e-06, eta: 0:10:45, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0359, loss_cls: 0.1496, acc: 94.2690, loss_bbox: 0.2035, loss_mask: 0.2150, loss: 0.6172
2024-06-01 14:51:44,628 - mmdet - INFO - Epoch [12][6400/7330] lr: 1.000e-06, eta: 0:10:12, time: 0.686, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0349, loss_cls: 0.1443, acc: 94.3967, loss_bbox: 0.1992, loss_mask: 0.2118, loss: 0.6028
2024-06-01 14:52:16,920 - mmdet - INFO - Epoch [12][6450/7330] lr: 1.000e-06, eta: 0:09:39, time: 0.646, data_time: 0.030, memory: 13277, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0335, loss_cls: 0.1397, acc: 94.6504, loss_bbox: 0.1963, loss_mask: 0.2137, loss: 0.5964
2024-06-01 14:52:54,007 - mmdet - INFO - Epoch [12][6500/7330] lr: 1.000e-06, eta: 0:09:06, time: 0.742, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0374, loss_cls: 0.1536, acc: 94.0522, loss_bbox: 0.2055, loss_mask: 0.2169, loss: 0.6280
2024-06-01 14:53:28,316 - mmdet - INFO - Epoch [12][6550/7330] lr: 1.000e-06, eta: 0:08:33, time: 0.686, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0354, loss_cls: 0.1407, acc: 94.6294, loss_bbox: 0.1980, loss_mask: 0.2137, loss: 0.5997
2024-06-01 14:54:00,749 - mmdet - INFO - Epoch [12][6600/7330] lr: 1.000e-06, eta: 0:08:00, time: 0.649, data_time: 0.044, memory: 13277, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0362, loss_cls: 0.1449, acc: 94.4695, loss_bbox: 0.2024, loss_mask: 0.2150, loss: 0.6112
2024-06-01 14:54:33,055 - mmdet - INFO - Epoch [12][6650/7330] lr: 1.000e-06, eta: 0:07:27, time: 0.646, data_time: 0.028, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0362, loss_cls: 0.1415, acc: 94.5200, loss_bbox: 0.1974, loss_mask: 0.2083, loss: 0.5957
2024-06-01 14:55:05,503 - mmdet - INFO - Epoch [12][6700/7330] lr: 1.000e-06, eta: 0:06:54, time: 0.649, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0348, loss_cls: 0.1483, acc: 94.3606, loss_bbox: 0.2033, loss_mask: 0.2159, loss: 0.6160
2024-06-01 14:55:37,928 - mmdet - INFO - Epoch [12][6750/7330] lr: 1.000e-06, eta: 0:06:21, time: 0.648, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0359, loss_cls: 0.1442, acc: 94.4460, loss_bbox: 0.1952, loss_mask: 0.2113, loss: 0.6002
2024-06-01 14:56:10,282 - mmdet - INFO - Epoch [12][6800/7330] lr: 1.000e-06, eta: 0:05:49, time: 0.647, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0373, loss_cls: 0.1511, acc: 94.1006, loss_bbox: 0.2112, loss_mask: 0.2229, loss: 0.6363
2024-06-01 14:56:42,639 - mmdet - INFO - Epoch [12][6850/7330] lr: 1.000e-06, eta: 0:05:16, time: 0.647, data_time: 0.042, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0369, loss_cls: 0.1477, acc: 94.3513, loss_bbox: 0.2029, loss_mask: 0.2118, loss: 0.6117
2024-06-01 14:57:14,780 - mmdet - INFO - Epoch [12][6900/7330] lr: 1.000e-06, eta: 0:04:43, time: 0.643, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0353, loss_cls: 0.1412, acc: 94.6121, loss_bbox: 0.1941, loss_mask: 0.2091, loss: 0.5918
2024-06-01 14:57:47,051 - mmdet - INFO - Epoch [12][6950/7330] lr: 1.000e-06, eta: 0:04:10, time: 0.645, data_time: 0.031, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0353, loss_cls: 0.1401, acc: 94.5903, loss_bbox: 0.1970, loss_mask: 0.2100, loss: 0.5946
2024-06-01 14:58:19,335 - mmdet - INFO - Epoch [12][7000/7330] lr: 1.000e-06, eta: 0:03:37, time: 0.646, data_time: 0.035, memory: 13277, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0357, loss_cls: 0.1457, acc: 94.3816, loss_bbox: 0.2021, loss_mask: 0.2124, loss: 0.6082
2024-06-01 14:58:51,377 - mmdet - INFO - Epoch [12][7050/7330] lr: 1.000e-06, eta: 0:03:04, time: 0.641, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0345, loss_cls: 0.1389, acc: 94.6379, loss_bbox: 0.1922, loss_mask: 0.2121, loss: 0.5908
2024-06-01 14:59:25,763 - mmdet - INFO - Epoch [12][7100/7330] lr: 1.000e-06, eta: 0:02:31, time: 0.688, data_time: 0.034, memory: 13277, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0354, loss_cls: 0.1387, acc: 94.6577, loss_bbox: 0.1958, loss_mask: 0.2080, loss: 0.5889
2024-06-01 15:00:00,918 - mmdet - INFO - Epoch [12][7150/7330] lr: 1.000e-06, eta: 0:01:58, time: 0.703, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0351, loss_cls: 0.1355, acc: 94.7903, loss_bbox: 0.1911, loss_mask: 0.2099, loss: 0.5840
2024-06-01 15:00:35,872 - mmdet - INFO - Epoch [12][7200/7330] lr: 1.000e-06, eta: 0:01:25, time: 0.699, data_time: 0.032, memory: 13277, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0337, loss_cls: 0.1379, acc: 94.6814, loss_bbox: 0.1914, loss_mask: 0.2066, loss: 0.5831
2024-06-01 15:01:08,189 - mmdet - INFO - Epoch [12][7250/7330] lr: 1.000e-06, eta: 0:00:52, time: 0.646, data_time: 0.029, memory: 13277, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0356, loss_cls: 0.1401, acc: 94.5210, loss_bbox: 0.1950, loss_mask: 0.2113, loss: 0.5945
2024-06-01 15:01:40,319 - mmdet - INFO - Epoch [12][7300/7330] lr: 1.000e-06, eta: 0:00:19, time: 0.642, data_time: 0.033, memory: 13277, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0349, loss_cls: 0.1383, acc: 94.7085, loss_bbox: 0.1940, loss_mask: 0.2096, loss: 0.5884
2024-06-01 15:02:02,396 - mmdet - INFO - Saving checkpoint at 12 epochs
2024-06-01 15:03:41,625 - mmdet - INFO - Evaluating bbox...
2024-06-01 15:04:03,011 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.483
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.711
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.526
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.293
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.525
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.660
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.592
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.394
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.636
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.770
2024-06-01 15:04:03,012 - mmdet - INFO - Evaluating segm...
2024-06-01 15:04:28,131 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.427
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.673
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.453
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.206
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.462
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.649
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.529
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.529
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.529
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.316
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.576
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.725
2024-06-01 15:04:28,509 - mmdet - INFO - Exp name: mask_rcnn_deit_sbl_1344_896_448_fpn_1x_coco_bs16.py
2024-06-01 15:04:28,511 - mmdet - INFO - Epoch(val) [12][625] bbox_mAP: 0.4830, bbox_mAP_50: 0.7110, bbox_mAP_75: 0.5260, bbox_mAP_s: 0.2930, bbox_mAP_m: 0.5250, bbox_mAP_l: 0.6600, bbox_mAP_copypaste: 0.483 0.711 0.526 0.293 0.525 0.660, segm_mAP: 0.4270, segm_mAP_50: 0.6730, segm_mAP_75: 0.4530, segm_mAP_s: 0.2060, segm_mAP_m: 0.4620, segm_mAP_l: 0.6490, segm_mAP_copypaste: 0.427 0.673 0.453 0.206 0.462 0.649
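Note: the `bbox_mAP_copypaste` and `segm_mAP_copypaste` fields in the final `Epoch(val)` line above pack the six COCO numbers in the order AP, AP50, AP75, AP_small, AP_medium, AP_large (matching the evaluation tables printed earlier). A minimal Python sketch for turning those strings into labeled metrics follows; the function and key names here are illustrative, not part of mmdet:

# Hypothetical helper (not from mmdet): map a space-separated copypaste
# string onto the standard COCO metric names in the order logged above.
COCO_KEYS = ["mAP", "mAP_50", "mAP_75", "mAP_s", "mAP_m", "mAP_l"]

def parse_copypaste(values: str) -> dict:
    """Return {metric_name: float} for a '*_mAP_copypaste' string."""
    return dict(zip(COCO_KEYS, map(float, values.split())))

# Values taken verbatim from the final validation summary in this log.
bbox = parse_copypaste("0.483 0.711 0.526 0.293 0.525 0.660")
segm = parse_copypaste("0.427 0.673 0.453 0.206 0.462 0.649")
assert bbox["mAP_50"] == 0.711 and segm["mAP"] == 0.427
print(bbox)
print(segm)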