2024-05-29 10:13:45,518 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+da84357
------------------------------------------------------------
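A quick way to reproduce the core of the environment block above on another machine before launching a run (plain PyTorch queries; the fuller table is what MMDetection's `collect_env` helper prints at startup):

import torch

# Print the environment facts that matter most for this run:
# PyTorch/CUDA/cuDNN versions and the visible GPUs.
print('PyTorch:', torch.__version__)
print('CUDA runtime:', torch.version.cuda)
print('cuDNN:', torch.backends.cudnn.version())
print('CUDA available:', torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    print(f'GPU {i}:', torch.cuda.get_device_name(i))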
2024-05-29 10:13:46,950 - mmdet - INFO - Distributed training: True
2024-05-29 10:13:48,398 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPThreeBranch',
        n_points=4,
        deform_num_heads=16,
        cffn_ratio=0.25,
        deform_ratio=0.5,
        with_cffn=True,
        interact_attn_type='deform',
        interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=672,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=768,
            num_heads=12,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.15,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5], [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)),
        branch2=dict(
            real_size=1120,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=384,
            num_heads=6,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5], [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)),
        branch3=dict(
            real_size=1568,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=192,
            num_heads=3,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1.0,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5], [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_tiny_patch16_224-a1311bcf.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 14],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True))),
    neck=dict(
        type='FPN',
        in_channels=[768, 768, 768, 768],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(type='AnchorGenerator', scales=[8], ratios=[0.5, 1.0, 2.0], strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0], target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=80,
            bbox_coder=dict(type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0], target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead',
            num_convs=4,
            in_channels=256,
            conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(type='MaxIoUAssigner', pos_iou_thr=0.7, neg_iou_thr=0.3, min_pos_iou=0.3, match_low_quality=True, ignore_iof_thr=-1),
            sampler=dict(type='RandomSampler', num=256, pos_fraction=0.5, neg_pos_ub=-1, add_gt_as_proposals=False),
            allowed_border=-1,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(nms_pre=2000, max_per_img=1000, nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0),
        rcnn=dict(
            assigner=dict(type='MaxIoUAssigner', pos_iou_thr=0.5, neg_iou_thr=0.5, min_pos_iou=0.5, match_low_quality=True, ignore_iof_thr=-1),
            sampler=dict(type='RandomSampler', num=512, pos_fraction=0.25, neg_pos_ub=-1, add_gt_as_proposals=True),
            mask_size=28,
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(nms_pre=1000, max_per_img=1000, nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0),
        rcnn=dict(score_thr=0.05, nms=dict(type='nms', iou_threshold=0.5), max_per_img=100, mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1568, 941),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1568, 941), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1568, 941),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(num_layers=12, layer_decay_rate=0.85, skip_stride=[1, 1]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(policy='step', warmup='linear', warmup_iters=500, warmup_ratio=0.001, step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)
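A few consistency checks on the config above, restating numbers that already appear in it: the three branches form a parameter-inverted pyramid (the largest input goes through the smallest ViT: DeiT-III Base at 672 px, DeiT-III Small at 1120 px, DeiT Tiny at 1568 px), Pad(size_divisor=224) matches window_size * patch_size = 14 * 16, and the global batch size matches auto_scale_lr.base_batch_size. The snippet is purely illustrative and not part of the training code:

# Illustrative arithmetic only; every number is taken from the config above.
patch, window = 16, 14
branches = {
    'branch1 (DeiT-III Base)':  dict(real_size=672,  embed_dim=768, num_heads=12),
    'branch2 (DeiT-III Small)': dict(real_size=1120, embed_dim=384, num_heads=6),
    'branch3 (DeiT Tiny)':      dict(real_size=1568, embed_dim=192, num_heads=3),
}
for name, b in branches.items():
    tokens = (b['real_size'] // patch) ** 2     # 42^2 = 1764, 70^2 = 4900, 98^2 = 9604
    print(f"{name}: {tokens} tokens at width {b['embed_dim']}")

assert 224 == window * patch   # Pad(size_divisor=224) keeps each padded side an integer
                               # number of 14x14-patch windows of 16-px patches
assert 8 * 2 == 16             # len(gpu_ids) * samples_per_gpu == auto_scale_lr.base_batch_size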
2024-05-29 10:13:53,498 - mmdet - INFO - Set random seed to 1088313239, deterministic: False
2024-05-29 10:13:56,627 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-29 10:13:58,870 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-29 10:14:00,985 - mmdet - INFO - _IncompatibleKeys(missing_keys=['blocks.0.gamma_1', 'blocks.0.gamma_2', 'blocks.1.gamma_1', 'blocks.1.gamma_2', 'blocks.2.gamma_1', 'blocks.2.gamma_2', 'blocks.3.gamma_1', 'blocks.3.gamma_2', 'blocks.4.gamma_1', 'blocks.4.gamma_2', 'blocks.5.gamma_1', 'blocks.5.gamma_2', 'blocks.6.gamma_1', 'blocks.6.gamma_2', 'blocks.7.gamma_1', 'blocks.7.gamma_2', 'blocks.8.gamma_1', 'blocks.8.gamma_2', 'blocks.9.gamma_1', 'blocks.9.gamma_2', 'blocks.10.gamma_1', 'blocks.10.gamma_2', 'blocks.11.gamma_1', 'blocks.11.gamma_2'], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-29 10:14:56,568 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-29 10:14:57,081 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-29 10:14:57,140 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
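The three _IncompatibleKeys reports above come from loading classification checkpoints into the detection branches: cls_token, the final norm and the classifier head have no counterpart in the backbone used here, and the DeiT-Tiny checkpoint for branch3 carries no LayerScale tensors, so its blocks.*.gamma_1/gamma_2 keep their fresh initialization. A minimal, self-contained illustration of how load_state_dict(strict=False) produces such a report (stand-in module and checkpoint, not the project's loading code):

import torch
import torch.nn as nn

# Stand-in branch and checkpoint, mimicking the _IncompatibleKeys reports above.
class TinyBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList([nn.Linear(4, 4)])   # stand-in transformer block
        self.gamma_1 = nn.Parameter(torch.ones(4))       # LayerScale-style parameter, absent from the checkpoint

checkpoint = {                                           # stand-in classification checkpoint
    'blocks.0.weight': torch.zeros(4, 4),
    'blocks.0.bias': torch.zeros(4),
    'cls_token': torch.zeros(1, 1, 4),                   # classification-only tensors
    'head.weight': torch.zeros(10, 4),
}
result = TinyBranch().load_state_dict(checkpoint, strict=False)
print(result.missing_keys)     # ['gamma_1']  -> kept at fresh init, like branch3's gamma_1/gamma_2
print(result.unexpected_keys)  # ['cls_token', 'head.weight'] -> ignored, as logged above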
Name of parameter - Initialization information (identical entries grouped: blocks.{0-11} means blocks.0 through blocks.11, .{weight, bias} means both parameters)
backbone.w1, backbone.w2, backbone.w3 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.pos_embed - torch.Size([1, 196, 768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.weight - torch.Size([768, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.{gamma_1, gamma_2} - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.norm1.{weight, bias} - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.norm2.{weight, bias} - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.{0-11}.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.{gamma_1, gamma_2} - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.norm1.{weight, bias} - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.norm2.{weight, bias} - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch2.blocks.{0-11}.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.pos_embed - torch.Size([1, 196, 192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.patch_embed.proj.weight - torch.Size([192, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.patch_embed.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.{gamma_1, gamma_2} - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.norm1.{weight, bias} - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.norm2.{weight, bias} - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-3}.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-2}.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-2}.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.{0-2}.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.5.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.7.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias 
- torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([192]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([576, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([576]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([192, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 
384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias 
- torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.3.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - 
torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias 
- torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.8.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before 
and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - 
torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight - 
torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.weight - 
torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - 
torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([96, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([192, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([48, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([48, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([192, 48]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([768, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - 
torch.Size([768, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch1.4.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch1.4.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.0.weight - torch.Size([768, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.3.weight - torch.Size([768, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.4.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch2.4.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.0.weight - torch.Size([768, 192, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.3.weight - torch.Size([768, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.4.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.merge_branch3.4.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.0.weight - torch.Size([768, 768, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.0.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.3.weight - torch.Size([768, 768, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn1.3.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn2.0.weight - torch.Size([768, 768, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.fpn2.0.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN
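Every backbone entry above reports that its value is unchanged by the detector-level `init_weights` call. In MMCV's initialization report this is the expected outcome when a submodule loads or initializes its own weights (for example, per-branch pretrained checkpoints) before the top-level pass runs, or when a parameter is deliberately left at its constructor default. Below is a minimal, generic sketch for confirming which tensors an `init_weights` call actually modifies; the config path is a placeholder, and the snapshot/compare logic is plain PyTorch rather than anything specific to this project.

    import torch
    from mmcv import Config
    from mmdet.models import build_detector

    # Placeholder path: substitute the config that produced this log.
    cfg = Config.fromfile('configs/mask_rcnn_piip_3branch_fpn_1x_coco.py')
    model = build_detector(cfg.model,
                           train_cfg=cfg.get('train_cfg'),
                           test_cfg=cfg.get('test_cfg'))

    # Snapshot all parameters and buffers before the detector-level init pass.
    before = {k: v.detach().clone() for k, v in model.state_dict().items()}
    model.init_weights()

    # Tensors whose values changed were (re-)initialized by init_weights;
    # the rest correspond to the "value is the same" entries in this log.
    changed = [k for k, v in model.state_dict().items()
               if not torch.equal(before[k], v)]
    print(f'{len(changed)} tensors were modified by init_weights')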
neck.lateral_convs.0.conv.weight - torch.Size([256, 768, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0
neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.lateral_convs.1.conv.weight - torch.Size([256, 768, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0
neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.lateral_convs.2.conv.weight - torch.Size([256, 768, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0
neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.lateral_convs.3.conv.weight - torch.Size([256, 768, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0
neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0
neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0
neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0
neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0
neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0
rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0
rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0
rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0
rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0
rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0
roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0
roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0
roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0
roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0
roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0
roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0
roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0
roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0
roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule
roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule
roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule
roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule
roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN
roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead
roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` in FCNMaskHead
roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead
roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead
2024-05-29 10:15:13,561 - mmdet - INFO - {'num_layers': 12, 'layer_decay_rate': 0.85, 'skip_stride': [1, 1]}
2024-05-29 10:15:13,561 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 14
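`Build LayerDecayOptimizerConstructor 0.850000 - 14` records the decay rate (0.85) and the total number of layer groups (14, presumably the 12 transformer blocks plus two extra groups for the embeddings and for parameters not tied to any block). In the usual layer-wise decay constructors, group i receives a learning-rate scale of decay_rate ** (num_layers - 1 - i), so the `layer_13_decay` group dumped below would train at the full base learning rate while the earliest group is scaled by roughly 0.85 ** 13 ≈ 0.12. The snippet below is only a sketch of that scaling rule under these assumptions, not this repository's constructor; the base learning rate is illustrative.

    # Sketch of the assumed layer-wise LR decay rule:
    #   scale_i = decay_rate ** (num_layers - 1 - i)
    # using the values reported in the log line above.
    decay_rate = 0.85
    num_layers = 14      # 12 blocks + embedding group + block-independent group
    base_lr = 1e-4       # illustrative base learning rate

    for layer_id in range(num_layers):
        scale = decay_rate ** (num_layers - 1 - layer_id)
        print(f'layer_{layer_id}_decay: lr_scale={scale:.4f}, lr={base_lr * scale:.2e}')

    # Under this rule: layer_13 -> scale 1.0000 (no decay), layer_0 -> scale ~0.1209.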
2024-05-29 10:15:13,572 - mmdet - INFO - Param groups = { "layer_13_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight",
"backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", 
"backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", 
"backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", 
"backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", 
"backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", 
"backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", 
"neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", "backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_0_no_decay": { "param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", "backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", 
"backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", "backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", 
"backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", "backbone.branch3.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", "backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", 
"backbone.branch3.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", "backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", "backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", 
"backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", 
"backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", "backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", 
"backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", 
"backbone.branch3.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", 
"backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", 
"backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", 
"backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", 
"backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", 
"backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", 
"roi_head.bbox_head.shared_fcs.0.bias", "roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-29 10:15:40,192 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 2024-05-29 10:15:40,603 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16 2024-05-29 10:15:40,603 - mmdet - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) StepLrUpdaterHook (49 ) ToBFloat16HookMMDet (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_epoch: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_iter: (VERY_HIGH ) StepLrUpdaterHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook -------------------- after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) DeepspeedCheckpointHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- after_train_epoch: (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_val_epoch: (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (VERY_LOW ) TextLoggerHook -------------------- before_val_iter: (LOW ) IterTimerHook -------------------- after_val_iter: (LOW ) IterTimerHook -------------------- after_val_epoch: (VERY_LOW ) TextLoggerHook -------------------- after_run: (VERY_LOW ) TextLoggerHook -------------------- 2024-05-29 10:15:40,603 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs 2024-05-29 10:15:40,615 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16 by HardDiskBackend. 
2024-05-29 10:16:21,272 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 19:51:09, time: 0.813, data_time: 0.127, memory: 11004, loss_rpn_cls: 0.5860, loss_rpn_bbox: 0.1245, loss_cls: 2.4613, acc: 56.5544, loss_bbox: 0.0481, loss_mask: 1.1470, loss: 4.3669 2024-05-29 10:16:55,647 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 18:18:36, time: 0.688, data_time: 0.066, memory: 11445, loss_rpn_cls: 0.2933, loss_rpn_bbox: 0.1089, loss_cls: 0.4070, acc: 95.4822, loss_bbox: 0.1409, loss_mask: 0.7561, loss: 1.7062 2024-05-29 10:17:28,793 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 17:35:20, time: 0.663, data_time: 0.063, memory: 11445, loss_rpn_cls: 0.2291, loss_rpn_bbox: 0.0992, loss_cls: 0.3369, acc: 95.2214, loss_bbox: 0.1538, loss_mask: 0.6953, loss: 1.5142 2024-05-29 10:18:01,338 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 17:09:04, time: 0.651, data_time: 0.056, memory: 11445, loss_rpn_cls: 0.2037, loss_rpn_bbox: 0.0991, loss_cls: 0.3044, acc: 95.4468, loss_bbox: 0.1469, loss_mask: 0.6802, loss: 1.4343 2024-05-29 10:18:35,483 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 17:02:26, time: 0.683, data_time: 0.061, memory: 11445, loss_rpn_cls: 0.1836, loss_rpn_bbox: 0.0987, loss_cls: 0.3559, acc: 94.6577, loss_bbox: 0.1743, loss_mask: 0.6658, loss: 1.4783 2024-05-29 10:19:08,165 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 16:50:42, time: 0.654, data_time: 0.064, memory: 11559, loss_rpn_cls: 0.1529, loss_rpn_bbox: 0.1016, loss_cls: 0.4173, acc: 93.6287, loss_bbox: 0.2248, loss_mask: 0.6447, loss: 1.5412 2024-05-29 10:19:42,087 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 16:47:21, time: 0.678, data_time: 0.064, memory: 11559, loss_rpn_cls: 0.1348, loss_rpn_bbox: 0.0988, loss_cls: 0.4474, acc: 92.9177, loss_bbox: 0.2523, loss_mask: 0.6289, loss: 1.5621 2024-05-29 10:20:14,801 - mmdet - INFO - Epoch [1][400/7330] lr: 7.982e-05, eta: 16:40:16, time: 0.654, data_time: 0.057, memory: 11559, loss_rpn_cls: 0.1159, loss_rpn_bbox: 0.0944, loss_cls: 0.4641, acc: 92.6077, loss_bbox: 0.2649, loss_mask: 0.6005, loss: 1.5399 2024-05-29 10:20:48,459 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 16:37:42, time: 0.673, data_time: 0.069, memory: 11625, loss_rpn_cls: 0.1221, loss_rpn_bbox: 0.1024, loss_cls: 0.4624, acc: 92.2283, loss_bbox: 0.2732, loss_mask: 0.5672, loss: 1.5273 2024-05-29 10:21:22,236 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 16:35:54, time: 0.676, data_time: 0.060, memory: 11625, loss_rpn_cls: 0.1090, loss_rpn_bbox: 0.0991, loss_cls: 0.4780, acc: 91.2595, loss_bbox: 0.3057, loss_mask: 0.5392, loss: 1.5310 2024-05-29 10:21:55,401 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 16:32:39, time: 0.663, data_time: 0.062, memory: 11625, loss_rpn_cls: 0.0972, loss_rpn_bbox: 0.0931, loss_cls: 0.4537, acc: 91.5205, loss_bbox: 0.3009, loss_mask: 0.5092, loss: 1.4541 2024-05-29 10:22:41,035 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 17:00:11, time: 0.913, data_time: 0.066, memory: 11625, loss_rpn_cls: 0.0919, loss_rpn_bbox: 0.0908, loss_cls: 0.4533, acc: 91.1479, loss_bbox: 0.3108, loss_mask: 0.5013, loss: 1.4482 2024-05-29 10:23:14,503 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 16:56:05, time: 0.669, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0870, loss_rpn_bbox: 0.0881, loss_cls: 0.4274, acc: 91.3901, loss_bbox: 0.3030, loss_mask: 0.4960, loss: 1.4015 2024-05-29 10:23:48,834 - mmdet - INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 16:54:18, time: 0.687, 
data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0867, loss_rpn_bbox: 0.0886, loss_cls: 0.4328, acc: 90.8079, loss_bbox: 0.3216, loss_mask: 0.4767, loss: 1.4065 2024-05-29 10:24:22,578 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 16:51:31, time: 0.675, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0797, loss_rpn_bbox: 0.0867, loss_cls: 0.4265, acc: 90.4641, loss_bbox: 0.3352, loss_mask: 0.4670, loss: 1.3951 2024-05-29 10:24:56,021 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 16:48:29, time: 0.669, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0775, loss_rpn_bbox: 0.0876, loss_cls: 0.4116, acc: 90.6301, loss_bbox: 0.3303, loss_mask: 0.4518, loss: 1.3589 2024-05-29 10:25:30,009 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 16:46:40, time: 0.680, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0771, loss_rpn_bbox: 0.0894, loss_cls: 0.4102, acc: 90.3567, loss_bbox: 0.3405, loss_mask: 0.4457, loss: 1.3629 2024-05-29 10:26:03,609 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 16:44:22, time: 0.672, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0834, loss_rpn_bbox: 0.0874, loss_cls: 0.4175, acc: 90.0603, loss_bbox: 0.3481, loss_mask: 0.4410, loss: 1.3774 2024-05-29 10:26:37,375 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 16:42:30, time: 0.675, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0775, loss_rpn_bbox: 0.0885, loss_cls: 0.4074, acc: 90.2666, loss_bbox: 0.3354, loss_mask: 0.4348, loss: 1.3436 2024-05-29 10:27:11,779 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 10:27:11,780 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 16:41:42, time: 0.688, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0783, loss_rpn_bbox: 0.0898, loss_cls: 0.3834, acc: 90.4282, loss_bbox: 0.3343, loss_mask: 0.4267, loss: 1.3125 2024-05-29 10:27:46,204 - mmdet - INFO - Epoch [1][1050/7330] lr: 1.000e-04, eta: 16:40:56, time: 0.688, data_time: 0.082, memory: 11626, loss_rpn_cls: 0.0691, loss_rpn_bbox: 0.0890, loss_cls: 0.4059, acc: 89.4800, loss_bbox: 0.3665, loss_mask: 0.4304, loss: 1.3609 2024-05-29 10:28:20,110 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 16:39:31, time: 0.678, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0635, loss_rpn_bbox: 0.0791, loss_cls: 0.3687, acc: 90.6482, loss_bbox: 0.3261, loss_mask: 0.4200, loss: 1.2573 2024-05-29 10:28:53,987 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 16:38:07, time: 0.677, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0675, loss_rpn_bbox: 0.0832, loss_cls: 0.3841, acc: 90.0081, loss_bbox: 0.3447, loss_mask: 0.4186, loss: 1.2981 2024-05-29 10:29:27,314 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 16:36:08, time: 0.667, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0645, loss_rpn_bbox: 0.0858, loss_cls: 0.3828, acc: 89.9050, loss_bbox: 0.3559, loss_mask: 0.4115, loss: 1.3005 2024-05-29 10:30:00,326 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 16:33:55, time: 0.660, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0666, loss_rpn_bbox: 0.0798, loss_cls: 0.3617, acc: 90.3499, loss_bbox: 0.3340, loss_mask: 0.4046, loss: 1.2468 2024-05-29 10:30:33,932 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 16:32:28, time: 0.672, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0657, loss_rpn_bbox: 0.0778, loss_cls: 0.3480, acc: 90.6946, loss_bbox: 0.3190, loss_mask: 0.4017, loss: 1.2122 2024-05-29 10:31:10,957 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 16:34:44, time: 
0.740, data_time: 0.074, memory: 11626, loss_rpn_cls: 0.0669, loss_rpn_bbox: 0.0851, loss_cls: 0.3726, acc: 89.7637, loss_bbox: 0.3542, loss_mask: 0.3998, loss: 1.2786 2024-05-29 10:31:57,179 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 16:46:18, time: 0.925, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0644, loss_rpn_bbox: 0.0788, loss_cls: 0.3542, acc: 90.0479, loss_bbox: 0.3493, loss_mask: 0.3972, loss: 1.2440 2024-05-29 10:32:31,552 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 16:45:13, time: 0.687, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0608, loss_rpn_bbox: 0.0797, loss_cls: 0.3623, acc: 89.9182, loss_bbox: 0.3475, loss_mask: 0.3948, loss: 1.2451 2024-05-29 10:33:06,007 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 16:44:15, time: 0.689, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0576, loss_rpn_bbox: 0.0756, loss_cls: 0.3578, acc: 90.0129, loss_bbox: 0.3500, loss_mask: 0.3910, loss: 1.2319 2024-05-29 10:33:40,584 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 16:43:25, time: 0.692, data_time: 0.072, memory: 11626, loss_rpn_cls: 0.0551, loss_rpn_bbox: 0.0772, loss_cls: 0.3401, acc: 90.3931, loss_bbox: 0.3276, loss_mask: 0.3800, loss: 1.1799 2024-05-29 10:34:14,605 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 16:42:06, time: 0.680, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0589, loss_rpn_bbox: 0.0812, loss_cls: 0.3522, acc: 89.9895, loss_bbox: 0.3478, loss_mask: 0.3846, loss: 1.2247 2024-05-29 10:34:48,438 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 16:40:40, time: 0.677, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0580, loss_rpn_bbox: 0.0759, loss_cls: 0.3392, acc: 90.3232, loss_bbox: 0.3397, loss_mask: 0.3826, loss: 1.1955 2024-05-29 10:35:23,420 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 16:40:15, time: 0.700, data_time: 0.074, memory: 11626, loss_rpn_cls: 0.0564, loss_rpn_bbox: 0.0839, loss_cls: 0.3502, acc: 89.8918, loss_bbox: 0.3543, loss_mask: 0.3817, loss: 1.2265 2024-05-29 10:35:57,374 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 16:38:59, time: 0.679, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0732, loss_cls: 0.3407, acc: 90.0725, loss_bbox: 0.3444, loss_mask: 0.3715, loss: 1.1794 2024-05-29 10:36:31,021 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 16:37:31, time: 0.673, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0563, loss_rpn_bbox: 0.0805, loss_cls: 0.3555, acc: 89.8638, loss_bbox: 0.3539, loss_mask: 0.3831, loss: 1.2292 2024-05-29 10:37:05,379 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 16:36:39, time: 0.687, data_time: 0.067, memory: 11626, loss_rpn_cls: 0.0531, loss_rpn_bbox: 0.0734, loss_cls: 0.3392, acc: 90.3257, loss_bbox: 0.3338, loss_mask: 0.3720, loss: 1.1714 2024-05-29 10:37:38,565 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 16:34:55, time: 0.664, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0498, loss_rpn_bbox: 0.0748, loss_cls: 0.3329, acc: 90.1936, loss_bbox: 0.3410, loss_mask: 0.3755, loss: 1.1740 2024-05-29 10:38:12,662 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 16:33:54, time: 0.682, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0614, loss_rpn_bbox: 0.0774, loss_cls: 0.3171, acc: 90.5835, loss_bbox: 0.3269, loss_mask: 0.3668, loss: 1.1496 2024-05-29 10:38:46,662 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 10:38:46,662 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 
16:32:51, time: 0.680, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0557, loss_rpn_bbox: 0.0772, loss_cls: 0.3324, acc: 90.1624, loss_bbox: 0.3414, loss_mask: 0.3717, loss: 1.1785 2024-05-29 10:39:21,134 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 16:32:09, time: 0.689, data_time: 0.072, memory: 11626, loss_rpn_cls: 0.0568, loss_rpn_bbox: 0.0748, loss_cls: 0.3252, acc: 90.2517, loss_bbox: 0.3382, loss_mask: 0.3634, loss: 1.1584 2024-05-29 10:39:58,101 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 16:33:09, time: 0.739, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0550, loss_rpn_bbox: 0.0782, loss_cls: 0.3370, acc: 89.9839, loss_bbox: 0.3462, loss_mask: 0.3667, loss: 1.1831 2024-05-29 10:40:46,212 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 16:41:29, time: 0.962, data_time: 0.068, memory: 11626, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0738, loss_cls: 0.3284, acc: 90.2756, loss_bbox: 0.3388, loss_mask: 0.3531, loss: 1.1509 2024-05-29 10:41:21,023 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 16:40:46, time: 0.696, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0561, loss_rpn_bbox: 0.0774, loss_cls: 0.3329, acc: 89.8599, loss_bbox: 0.3467, loss_mask: 0.3597, loss: 1.1727 2024-05-29 10:41:54,541 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 16:39:14, time: 0.670, data_time: 0.073, memory: 11626, loss_rpn_cls: 0.0554, loss_rpn_bbox: 0.0751, loss_cls: 0.3261, acc: 90.2073, loss_bbox: 0.3389, loss_mask: 0.3605, loss: 1.1559 2024-05-29 10:42:28,500 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 16:38:02, time: 0.679, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0527, loss_rpn_bbox: 0.0792, loss_cls: 0.3295, acc: 89.9548, loss_bbox: 0.3480, loss_mask: 0.3573, loss: 1.1666 2024-05-29 10:43:02,750 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 16:37:01, time: 0.685, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0511, loss_rpn_bbox: 0.0748, loss_cls: 0.3360, acc: 89.8345, loss_bbox: 0.3530, loss_mask: 0.3645, loss: 1.1793 2024-05-29 10:43:36,775 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 16:35:54, time: 0.681, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0512, loss_rpn_bbox: 0.0737, loss_cls: 0.3256, acc: 90.1487, loss_bbox: 0.3384, loss_mask: 0.3597, loss: 1.1486 2024-05-29 10:44:10,931 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 16:34:51, time: 0.683, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0513, loss_rpn_bbox: 0.0753, loss_cls: 0.3306, acc: 89.9673, loss_bbox: 0.3476, loss_mask: 0.3630, loss: 1.1678 2024-05-29 10:44:44,703 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 16:33:38, time: 0.676, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0535, loss_rpn_bbox: 0.0765, loss_cls: 0.3257, acc: 90.1306, loss_bbox: 0.3382, loss_mask: 0.3666, loss: 1.1605 2024-05-29 10:45:19,153 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 16:32:49, time: 0.689, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0537, loss_rpn_bbox: 0.0733, loss_cls: 0.3216, acc: 90.1770, loss_bbox: 0.3411, loss_mask: 0.3528, loss: 1.1424 2024-05-29 10:45:53,437 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 16:31:55, time: 0.686, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0505, loss_rpn_bbox: 0.0749, loss_cls: 0.3182, acc: 90.3193, loss_bbox: 0.3374, loss_mask: 0.3530, loss: 1.1340 2024-05-29 10:46:27,729 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 16:31:02, time: 0.686, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0517, loss_rpn_bbox: 
0.0752, loss_cls: 0.3406, acc: 89.6956, loss_bbox: 0.3556, loss_mask: 0.3570, loss: 1.1800 2024-05-29 10:47:02,542 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 16:30:26, time: 0.696, data_time: 0.085, memory: 11626, loss_rpn_cls: 0.0519, loss_rpn_bbox: 0.0750, loss_cls: 0.3226, acc: 90.2605, loss_bbox: 0.3386, loss_mask: 0.3538, loss: 1.1419 2024-05-29 10:47:37,285 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 16:29:47, time: 0.695, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0509, loss_rpn_bbox: 0.0727, loss_cls: 0.3289, acc: 90.0410, loss_bbox: 0.3494, loss_mask: 0.3493, loss: 1.1512 2024-05-29 10:48:10,562 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 16:28:25, time: 0.666, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0466, loss_rpn_bbox: 0.0696, loss_cls: 0.3087, acc: 90.4951, loss_bbox: 0.3322, loss_mask: 0.3411, loss: 1.0982 2024-05-29 10:48:44,132 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 16:27:13, time: 0.671, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0709, loss_cls: 0.3076, acc: 90.5955, loss_bbox: 0.3262, loss_mask: 0.3428, loss: 1.0926 2024-05-29 10:49:30,993 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 16:32:32, time: 0.937, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0494, loss_rpn_bbox: 0.0696, loss_cls: 0.3080, acc: 90.6724, loss_bbox: 0.3261, loss_mask: 0.3453, loss: 1.0984 2024-05-29 10:50:05,496 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 16:31:42, time: 0.690, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0723, loss_cls: 0.3118, acc: 89.9741, loss_bbox: 0.3501, loss_mask: 0.3419, loss: 1.1252 2024-05-29 10:50:39,581 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 10:50:39,581 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 16:30:41, time: 0.682, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0482, loss_rpn_bbox: 0.0715, loss_cls: 0.3019, acc: 90.5486, loss_bbox: 0.3292, loss_mask: 0.3443, loss: 1.0951 2024-05-29 10:51:13,524 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 16:29:37, time: 0.679, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0479, loss_rpn_bbox: 0.0698, loss_cls: 0.3095, acc: 90.5803, loss_bbox: 0.3270, loss_mask: 0.3426, loss: 1.0968 2024-05-29 10:51:47,574 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 16:28:37, time: 0.681, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0459, loss_rpn_bbox: 0.0668, loss_cls: 0.3060, acc: 90.4895, loss_bbox: 0.3286, loss_mask: 0.3370, loss: 1.0844 2024-05-29 10:52:21,806 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 16:27:43, time: 0.685, data_time: 0.070, memory: 11626, loss_rpn_cls: 0.0498, loss_rpn_bbox: 0.0735, loss_cls: 0.3073, acc: 90.4897, loss_bbox: 0.3288, loss_mask: 0.3439, loss: 1.1032 2024-05-29 10:52:55,552 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 16:26:36, time: 0.675, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0495, loss_rpn_bbox: 0.0720, loss_cls: 0.3125, acc: 90.1287, loss_bbox: 0.3396, loss_mask: 0.3422, loss: 1.1157 2024-05-29 10:53:29,413 - mmdet - INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 16:25:34, time: 0.677, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0700, loss_cls: 0.3086, acc: 90.4568, loss_bbox: 0.3308, loss_mask: 0.3402, loss: 1.0957 2024-05-29 10:54:03,518 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 16:24:39, time: 0.682, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0475, 
loss_rpn_bbox: 0.0724, loss_cls: 0.3216, acc: 90.0627, loss_bbox: 0.3438, loss_mask: 0.3351, loss: 1.1204 2024-05-29 10:54:37,325 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 16:23:36, time: 0.676, data_time: 0.068, memory: 11626, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0704, loss_cls: 0.2958, acc: 90.7683, loss_bbox: 0.3215, loss_mask: 0.3452, loss: 1.0787 2024-05-29 10:55:11,518 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 16:22:44, time: 0.684, data_time: 0.064, memory: 11626, loss_rpn_cls: 0.0489, loss_rpn_bbox: 0.0731, loss_cls: 0.3041, acc: 90.5166, loss_bbox: 0.3313, loss_mask: 0.3365, loss: 1.0940 2024-05-29 10:55:45,358 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 16:21:44, time: 0.677, data_time: 0.068, memory: 11626, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0704, loss_cls: 0.2995, acc: 90.5632, loss_bbox: 0.3301, loss_mask: 0.3360, loss: 1.0851 2024-05-29 10:56:19,118 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 16:20:43, time: 0.675, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0683, loss_cls: 0.2963, acc: 90.7185, loss_bbox: 0.3244, loss_mask: 0.3295, loss: 1.0623 2024-05-29 10:56:52,771 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 16:19:40, time: 0.673, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0462, loss_rpn_bbox: 0.0687, loss_cls: 0.3008, acc: 90.5781, loss_bbox: 0.3276, loss_mask: 0.3328, loss: 1.0761 2024-05-29 10:57:26,695 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 16:18:44, time: 0.678, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0682, loss_cls: 0.2983, acc: 90.6045, loss_bbox: 0.3324, loss_mask: 0.3379, loss: 1.0827 2024-05-29 10:58:15,590 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 16:23:35, time: 0.978, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0470, loss_rpn_bbox: 0.0745, loss_cls: 0.3072, acc: 90.3774, loss_bbox: 0.3402, loss_mask: 0.3382, loss: 1.1070 2024-05-29 10:58:49,676 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 16:22:39, time: 0.682, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0695, loss_cls: 0.2963, acc: 90.6436, loss_bbox: 0.3220, loss_mask: 0.3254, loss: 1.0576 2024-05-29 10:59:23,236 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 16:21:32, time: 0.671, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0440, loss_rpn_bbox: 0.0715, loss_cls: 0.3016, acc: 90.5259, loss_bbox: 0.3252, loss_mask: 0.3286, loss: 1.0709 2024-05-29 10:59:57,065 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 16:20:32, time: 0.677, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0504, loss_rpn_bbox: 0.0689, loss_cls: 0.2995, acc: 90.6035, loss_bbox: 0.3269, loss_mask: 0.3374, loss: 1.0832 2024-05-29 11:00:31,196 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 16:19:39, time: 0.682, data_time: 0.078, memory: 11626, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0710, loss_cls: 0.3036, acc: 90.2100, loss_bbox: 0.3352, loss_mask: 0.3323, loss: 1.0902 2024-05-29 11:01:05,159 - mmdet - INFO - Epoch [1][3900/7330] lr: 1.000e-04, eta: 16:18:43, time: 0.680, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0655, loss_cls: 0.2912, acc: 90.7048, loss_bbox: 0.3240, loss_mask: 0.3220, loss: 1.0423 2024-05-29 11:01:39,124 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 16:17:48, time: 0.679, data_time: 0.075, memory: 11626, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0740, loss_cls: 0.2947, acc: 90.6782, loss_bbox: 0.3170, loss_mask: 0.3272, 
loss: 1.0585 2024-05-29 11:02:14,282 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 11:02:14,283 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 16:17:18, time: 0.703, data_time: 0.072, memory: 11626, loss_rpn_cls: 0.0473, loss_rpn_bbox: 0.0695, loss_cls: 0.3064, acc: 90.2473, loss_bbox: 0.3340, loss_mask: 0.3306, loss: 1.0879 2024-05-29 11:02:47,785 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 16:16:13, time: 0.670, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0651, loss_cls: 0.2995, acc: 90.6230, loss_bbox: 0.3239, loss_mask: 0.3262, loss: 1.0571 2024-05-29 11:03:21,736 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 16:15:19, time: 0.679, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0686, loss_cls: 0.3032, acc: 90.6162, loss_bbox: 0.3205, loss_mask: 0.3247, loss: 1.0651 2024-05-29 11:03:56,305 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 16:14:38, time: 0.691, data_time: 0.068, memory: 11626, loss_rpn_cls: 0.0488, loss_rpn_bbox: 0.0721, loss_cls: 0.3036, acc: 90.3679, loss_bbox: 0.3360, loss_mask: 0.3341, loss: 1.0947 2024-05-29 11:04:30,210 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 16:13:43, time: 0.678, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0690, loss_cls: 0.3033, acc: 90.5137, loss_bbox: 0.3283, loss_mask: 0.3231, loss: 1.0650 2024-05-29 11:05:04,786 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 16:13:02, time: 0.691, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0683, loss_cls: 0.3000, acc: 90.4570, loss_bbox: 0.3278, loss_mask: 0.3231, loss: 1.0634 2024-05-29 11:05:38,618 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 16:12:07, time: 0.676, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0442, loss_rpn_bbox: 0.0676, loss_cls: 0.2964, acc: 90.7234, loss_bbox: 0.3254, loss_mask: 0.3322, loss: 1.0657 2024-05-29 11:06:13,079 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 16:11:25, time: 0.690, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0660, loss_cls: 0.3027, acc: 90.5000, loss_bbox: 0.3285, loss_mask: 0.3225, loss: 1.0610 2024-05-29 11:06:54,996 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 16:13:04, time: 0.838, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0436, loss_rpn_bbox: 0.0711, loss_cls: 0.2974, acc: 90.5005, loss_bbox: 0.3307, loss_mask: 0.3250, loss: 1.0678 2024-05-29 11:07:37,163 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 16:14:45, time: 0.843, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0631, loss_cls: 0.2883, acc: 90.7131, loss_bbox: 0.3217, loss_mask: 0.3248, loss: 1.0377 2024-05-29 11:08:11,190 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 16:13:51, time: 0.680, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0676, loss_cls: 0.3019, acc: 90.3403, loss_bbox: 0.3358, loss_mask: 0.3305, loss: 1.0780 2024-05-29 11:08:46,006 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 16:13:13, time: 0.696, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0696, loss_cls: 0.3085, acc: 90.3511, loss_bbox: 0.3309, loss_mask: 0.3266, loss: 1.0785 2024-05-29 11:09:19,864 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 16:12:17, time: 0.677, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0674, loss_cls: 0.2841, acc: 90.7300, loss_bbox: 0.3188, loss_mask: 
0.3237, loss: 1.0371 2024-05-29 11:09:54,441 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 16:11:35, time: 0.692, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0431, loss_rpn_bbox: 0.0662, loss_cls: 0.2937, acc: 90.9495, loss_bbox: 0.3151, loss_mask: 0.3240, loss: 1.0422 2024-05-29 11:10:28,650 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 16:10:46, time: 0.684, data_time: 0.076, memory: 11626, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0662, loss_cls: 0.3006, acc: 90.4397, loss_bbox: 0.3270, loss_mask: 0.3287, loss: 1.0647 2024-05-29 11:11:03,492 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 16:10:09, time: 0.697, data_time: 0.064, memory: 11626, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0651, loss_cls: 0.2899, acc: 90.5759, loss_bbox: 0.3213, loss_mask: 0.3263, loss: 1.0450 2024-05-29 11:11:38,901 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 16:09:41, time: 0.708, data_time: 0.069, memory: 11626, loss_rpn_cls: 0.0442, loss_rpn_bbox: 0.0739, loss_cls: 0.3044, acc: 90.3459, loss_bbox: 0.3379, loss_mask: 0.3268, loss: 1.0871 2024-05-29 11:12:13,619 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 16:09:02, time: 0.694, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0693, loss_cls: 0.3059, acc: 90.1079, loss_bbox: 0.3393, loss_mask: 0.3266, loss: 1.0826 2024-05-29 11:12:47,182 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 16:08:03, time: 0.671, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0656, loss_cls: 0.2842, acc: 90.9126, loss_bbox: 0.3128, loss_mask: 0.3199, loss: 1.0211 2024-05-29 11:13:21,741 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 16:07:21, time: 0.691, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0469, loss_rpn_bbox: 0.0719, loss_cls: 0.2965, acc: 90.3149, loss_bbox: 0.3318, loss_mask: 0.3325, loss: 1.0797 2024-05-29 11:13:55,828 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 11:13:55,828 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 16:06:31, time: 0.682, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0425, loss_rpn_bbox: 0.0696, loss_cls: 0.2969, acc: 90.5820, loss_bbox: 0.3245, loss_mask: 0.3231, loss: 1.0567 2024-05-29 11:14:29,801 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 16:05:40, time: 0.679, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0682, loss_cls: 0.2915, acc: 90.4985, loss_bbox: 0.3289, loss_mask: 0.3308, loss: 1.0651 2024-05-29 11:15:04,249 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 16:04:57, time: 0.689, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0619, loss_cls: 0.2771, acc: 91.1665, loss_bbox: 0.3124, loss_mask: 0.3202, loss: 1.0126 2024-05-29 11:15:40,374 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 16:04:41, time: 0.723, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0645, loss_cls: 0.2738, acc: 91.1709, loss_bbox: 0.3104, loss_mask: 0.3176, loss: 1.0077 2024-05-29 11:16:29,028 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 16:07:45, time: 0.973, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0425, loss_rpn_bbox: 0.0658, loss_cls: 0.2892, acc: 90.5186, loss_bbox: 0.3248, loss_mask: 0.3205, loss: 1.0429 2024-05-29 11:17:03,361 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 16:06:58, time: 0.687, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0429, loss_rpn_bbox: 0.0663, loss_cls: 0.2770, acc: 90.9231, loss_bbox: 0.3164, 
loss_mask: 0.3139, loss: 1.0165 2024-05-29 11:17:37,822 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 16:06:13, time: 0.689, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0682, loss_cls: 0.2984, acc: 90.3926, loss_bbox: 0.3274, loss_mask: 0.3243, loss: 1.0594 2024-05-29 11:18:12,390 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 16:05:30, time: 0.691, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0444, loss_rpn_bbox: 0.0691, loss_cls: 0.2933, acc: 90.4902, loss_bbox: 0.3296, loss_mask: 0.3231, loss: 1.0595 2024-05-29 11:18:47,422 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 16:04:55, time: 0.701, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0681, loss_cls: 0.2797, acc: 90.8579, loss_bbox: 0.3137, loss_mask: 0.3167, loss: 1.0200 2024-05-29 11:19:21,943 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 16:04:12, time: 0.690, data_time: 0.067, memory: 11626, loss_rpn_cls: 0.0449, loss_rpn_bbox: 0.0685, loss_cls: 0.2921, acc: 90.5754, loss_bbox: 0.3227, loss_mask: 0.3132, loss: 1.0414 2024-05-29 11:19:56,498 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 16:03:29, time: 0.691, data_time: 0.084, memory: 11626, loss_rpn_cls: 0.0470, loss_rpn_bbox: 0.0688, loss_cls: 0.2889, acc: 90.6584, loss_bbox: 0.3195, loss_mask: 0.3174, loss: 1.0416 2024-05-29 11:20:30,943 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 16:02:45, time: 0.689, data_time: 0.070, memory: 11626, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0661, loss_cls: 0.2800, acc: 91.0530, loss_bbox: 0.3067, loss_mask: 0.3146, loss: 1.0076 2024-05-29 11:21:06,066 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 16:02:11, time: 0.702, data_time: 0.074, memory: 11626, loss_rpn_cls: 0.0437, loss_rpn_bbox: 0.0689, loss_cls: 0.2999, acc: 90.2925, loss_bbox: 0.3322, loss_mask: 0.3179, loss: 1.0626 2024-05-29 11:21:40,207 - mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 16:01:23, time: 0.683, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0395, loss_rpn_bbox: 0.0634, loss_cls: 0.2906, acc: 90.7893, loss_bbox: 0.3186, loss_mask: 0.3098, loss: 1.0219 2024-05-29 11:22:13,855 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 16:00:28, time: 0.673, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0594, loss_cls: 0.2663, acc: 91.2937, loss_bbox: 0.3036, loss_mask: 0.3096, loss: 0.9764 2024-05-29 11:22:48,250 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 15:59:43, time: 0.688, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0440, loss_rpn_bbox: 0.0665, loss_cls: 0.2942, acc: 90.2925, loss_bbox: 0.3305, loss_mask: 0.3171, loss: 1.0523 2024-05-29 11:23:22,474 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 15:58:57, time: 0.684, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0409, loss_rpn_bbox: 0.0617, loss_cls: 0.2811, acc: 90.9316, loss_bbox: 0.3137, loss_mask: 0.3102, loss: 1.0076 2024-05-29 11:23:56,236 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 15:58:04, time: 0.675, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0656, loss_cls: 0.2786, acc: 90.9133, loss_bbox: 0.3117, loss_mask: 0.3102, loss: 1.0076 2024-05-29 11:24:29,850 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 15:57:10, time: 0.672, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0614, loss_cls: 0.2667, acc: 91.3806, loss_bbox: 0.3024, loss_mask: 0.3024, loss: 0.9729 2024-05-29 11:25:20,932 - mmdet - INFO - Epoch [1][5950/7330] 
lr: 1.000e-04, eta: 16:00:17, time: 1.022, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0403, loss_rpn_bbox: 0.0657, loss_cls: 0.2734, acc: 91.1616, loss_bbox: 0.3002, loss_mask: 0.3046, loss: 0.9841 2024-05-29 11:25:54,598 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 11:25:54,598 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 15:59:22, time: 0.673, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0412, loss_rpn_bbox: 0.0637, loss_cls: 0.2654, acc: 91.5037, loss_bbox: 0.2946, loss_mask: 0.3048, loss: 0.9697 2024-05-29 11:26:29,553 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 15:58:44, time: 0.699, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0668, loss_cls: 0.2874, acc: 90.7983, loss_bbox: 0.3165, loss_mask: 0.3103, loss: 1.0203 2024-05-29 11:27:03,992 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 15:58:00, time: 0.689, data_time: 0.064, memory: 11626, loss_rpn_cls: 0.0436, loss_rpn_bbox: 0.0685, loss_cls: 0.2772, acc: 91.1929, loss_bbox: 0.3050, loss_mask: 0.3171, loss: 1.0114 2024-05-29 11:27:37,852 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 15:57:09, time: 0.677, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0624, loss_cls: 0.2688, acc: 91.2336, loss_bbox: 0.3036, loss_mask: 0.3025, loss: 0.9757 2024-05-29 11:28:11,251 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 15:56:11, time: 0.668, data_time: 0.070, memory: 11626, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0639, loss_cls: 0.2810, acc: 90.8789, loss_bbox: 0.3161, loss_mask: 0.3089, loss: 1.0082 2024-05-29 11:28:45,062 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 15:55:19, time: 0.676, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0632, loss_cls: 0.2722, acc: 91.3779, loss_bbox: 0.3004, loss_mask: 0.3066, loss: 0.9842 2024-05-29 11:29:18,591 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 15:54:24, time: 0.671, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0607, loss_cls: 0.2797, acc: 90.9390, loss_bbox: 0.3136, loss_mask: 0.3109, loss: 1.0035 2024-05-29 11:29:52,488 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 15:53:34, time: 0.678, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0399, loss_rpn_bbox: 0.0627, loss_cls: 0.2711, acc: 91.2317, loss_bbox: 0.3016, loss_mask: 0.3055, loss: 0.9808 2024-05-29 11:30:26,464 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 15:52:45, time: 0.680, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0406, loss_rpn_bbox: 0.0664, loss_cls: 0.2850, acc: 90.7095, loss_bbox: 0.3245, loss_mask: 0.3078, loss: 1.0242 2024-05-29 11:31:00,938 - mmdet - INFO - Epoch [1][6450/7330] lr: 1.000e-04, eta: 15:52:03, time: 0.689, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0643, loss_cls: 0.2827, acc: 90.8213, loss_bbox: 0.3181, loss_mask: 0.3163, loss: 1.0239 2024-05-29 11:31:35,286 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 15:51:19, time: 0.687, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0381, loss_rpn_bbox: 0.0626, loss_cls: 0.2661, acc: 91.2546, loss_bbox: 0.2988, loss_mask: 0.3021, loss: 0.9677 2024-05-29 11:32:09,150 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 15:50:30, time: 0.677, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0381, loss_rpn_bbox: 0.0636, loss_cls: 0.2650, acc: 91.3655, loss_bbox: 0.3030, loss_mask: 0.3026, loss: 0.9723 2024-05-29 11:32:42,677 - mmdet - INFO - Epoch 
[1][6600/7330] lr: 1.000e-04, eta: 15:49:36, time: 0.671, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0612, loss_cls: 0.2701, acc: 91.1716, loss_bbox: 0.3039, loss_mask: 0.3077, loss: 0.9782 2024-05-29 11:33:17,184 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 15:48:55, time: 0.690, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0685, loss_cls: 0.2852, acc: 90.7422, loss_bbox: 0.3137, loss_mask: 0.3101, loss: 1.0184 2024-05-29 11:34:06,820 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 15:51:17, time: 0.993, data_time: 0.068, memory: 11626, loss_rpn_cls: 0.0425, loss_rpn_bbox: 0.0663, loss_cls: 0.2799, acc: 90.8528, loss_bbox: 0.3139, loss_mask: 0.3102, loss: 1.0128 2024-05-29 11:34:41,026 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 15:50:31, time: 0.684, data_time: 0.067, memory: 11626, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0639, loss_cls: 0.2768, acc: 91.0576, loss_bbox: 0.3057, loss_mask: 0.3058, loss: 0.9913 2024-05-29 11:35:15,060 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 15:49:43, time: 0.681, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0419, loss_rpn_bbox: 0.0625, loss_cls: 0.2762, acc: 90.9580, loss_bbox: 0.3063, loss_mask: 0.2976, loss: 0.9846 2024-05-29 11:35:48,827 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 15:48:52, time: 0.675, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0627, loss_cls: 0.2707, acc: 90.9551, loss_bbox: 0.3097, loss_mask: 0.3054, loss: 0.9859 2024-05-29 11:36:22,793 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 15:48:04, time: 0.679, data_time: 0.066, memory: 11626, loss_rpn_cls: 0.0399, loss_rpn_bbox: 0.0637, loss_cls: 0.2742, acc: 91.0493, loss_bbox: 0.3044, loss_mask: 0.3019, loss: 0.9841 2024-05-29 11:36:57,487 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 15:47:24, time: 0.694, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0658, loss_cls: 0.2857, acc: 90.7109, loss_bbox: 0.3164, loss_mask: 0.3102, loss: 1.0185 2024-05-29 11:37:31,717 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 11:37:31,717 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 15:46:39, time: 0.685, data_time: 0.079, memory: 11626, loss_rpn_cls: 0.0419, loss_rpn_bbox: 0.0641, loss_cls: 0.2896, acc: 90.5393, loss_bbox: 0.3218, loss_mask: 0.3120, loss: 1.0294 2024-05-29 11:38:05,217 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 15:45:46, time: 0.670, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0360, loss_rpn_bbox: 0.0609, loss_cls: 0.2740, acc: 91.0300, loss_bbox: 0.3106, loss_mask: 0.3070, loss: 0.9885 2024-05-29 11:38:39,606 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 15:45:03, time: 0.688, data_time: 0.084, memory: 11626, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0614, loss_cls: 0.2712, acc: 91.2671, loss_bbox: 0.2983, loss_mask: 0.2917, loss: 0.9590 2024-05-29 11:39:14,172 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 15:44:23, time: 0.691, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0404, loss_rpn_bbox: 0.0665, loss_cls: 0.2883, acc: 90.8032, loss_bbox: 0.3142, loss_mask: 0.3042, loss: 1.0137 2024-05-29 11:39:48,064 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 15:43:34, time: 0.678, data_time: 0.070, memory: 11626, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0630, loss_cls: 0.2752, acc: 90.9028, loss_bbox: 0.3207, loss_mask: 0.3054, loss: 1.0013 2024-05-29 11:40:22,511 - mmdet - INFO 
- Epoch [1][7250/7330] lr: 1.000e-04, eta: 15:42:53, time: 0.689, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0639, loss_cls: 0.2829, acc: 90.7671, loss_bbox: 0.3165, loss_mask: 0.3046, loss: 1.0050
2024-05-29 11:40:56,720 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 15:42:08, time: 0.684, data_time: 0.067, memory: 11626, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0642, loss_cls: 0.2769, acc: 90.9180, loss_bbox: 0.3078, loss_mask: 0.2981, loss: 0.9833
2024-05-29 11:41:18,102 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-29 11:43:19,457 - mmdet - INFO - Evaluating bbox...
2024-05-29 11:43:51,561 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.283
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.524
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.281
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.155
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.318
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.399
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.424
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.424
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.424
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.246
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.464
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.571
2024-05-29 11:43:51,562 - mmdet - INFO - Evaluating segm...
2024-05-29 11:44:27,021 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.279
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.489
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.283
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.110
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.308
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.456
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.407
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.407
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.407
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.205
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.453
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.587
2024-05-29 11:44:27,594 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 11:44:27,594 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.2830, bbox_mAP_50: 0.5240, bbox_mAP_75: 0.2810, bbox_mAP_s: 0.1550, bbox_mAP_m: 0.3180, bbox_mAP_l: 0.3990, bbox_mAP_copypaste: 0.283 0.524 0.281 0.155 0.318 0.399, segm_mAP: 0.2790, segm_mAP_50: 0.4890, segm_mAP_75: 0.2830, segm_mAP_s: 0.1100, segm_mAP_m: 0.3080, segm_mAP_l: 0.4560, segm_mAP_copypaste: 0.279 0.489 0.283 0.110 0.308 0.456
2024-05-29 11:45:06,962 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 15:38:10, time: 0.787, data_time: 0.113, memory: 11626, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0591, loss_cls: 0.2625, acc: 91.2314, loss_bbox: 0.3005, loss_mask: 0.2909, loss: 0.9481
2024-05-29 11:45:39,763 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 15:37:12, time: 0.656, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0626, loss_cls:
0.2616, acc: 91.2793, loss_bbox: 0.2993, loss_mask: 0.2944, loss: 0.9549 2024-05-29 11:46:13,498 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 15:36:24, time: 0.675, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0641, loss_cls: 0.2664, acc: 91.1470, loss_bbox: 0.3046, loss_mask: 0.2953, loss: 0.9696 2024-05-29 11:46:47,108 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 15:35:35, time: 0.672, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0623, loss_cls: 0.2631, acc: 91.2307, loss_bbox: 0.3059, loss_mask: 0.2975, loss: 0.9644 2024-05-29 11:47:20,784 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 15:34:47, time: 0.674, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0616, loss_cls: 0.2660, acc: 91.1580, loss_bbox: 0.3018, loss_mask: 0.2988, loss: 0.9643 2024-05-29 11:47:54,133 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 15:33:56, time: 0.667, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0384, loss_rpn_bbox: 0.0638, loss_cls: 0.2613, acc: 91.2651, loss_bbox: 0.3005, loss_mask: 0.2930, loss: 0.9571 2024-05-29 11:48:26,794 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 15:32:58, time: 0.653, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0367, loss_rpn_bbox: 0.0657, loss_cls: 0.2629, acc: 91.3308, loss_bbox: 0.3035, loss_mask: 0.2964, loss: 0.9652 2024-05-29 11:48:59,892 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 15:32:05, time: 0.662, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0618, loss_cls: 0.2715, acc: 91.0212, loss_bbox: 0.3095, loss_mask: 0.2975, loss: 0.9756 2024-05-29 11:49:33,599 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 15:31:18, time: 0.674, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0624, loss_cls: 0.2705, acc: 90.8472, loss_bbox: 0.3142, loss_mask: 0.3011, loss: 0.9850 2024-05-29 11:50:06,921 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 15:30:28, time: 0.666, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0632, loss_cls: 0.2696, acc: 90.9080, loss_bbox: 0.3132, loss_mask: 0.3000, loss: 0.9839 2024-05-29 11:50:40,677 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 15:29:42, time: 0.675, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0650, loss_cls: 0.2512, acc: 91.5481, loss_bbox: 0.2936, loss_mask: 0.2901, loss: 0.9351 2024-05-29 11:51:13,583 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 15:28:48, time: 0.658, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0561, loss_cls: 0.2521, acc: 91.4836, loss_bbox: 0.2972, loss_mask: 0.2944, loss: 0.9331 2024-05-29 11:51:46,926 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 15:27:58, time: 0.667, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0629, loss_cls: 0.2739, acc: 90.8401, loss_bbox: 0.3130, loss_mask: 0.2946, loss: 0.9814 2024-05-29 11:52:20,211 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 15:27:08, time: 0.666, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0642, loss_cls: 0.2619, acc: 91.2043, loss_bbox: 0.3063, loss_mask: 0.2966, loss: 0.9659 2024-05-29 11:52:53,298 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 15:26:17, time: 0.662, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0627, loss_cls: 0.2474, acc: 91.7327, loss_bbox: 0.2916, loss_mask: 0.2855, loss: 0.9228 2024-05-29 11:53:26,158 - mmdet - 
INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 15:25:23, time: 0.657, data_time: 0.039, memory: 11626, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0574, loss_cls: 0.2539, acc: 91.4746, loss_bbox: 0.2978, loss_mask: 0.2951, loss: 0.9384 2024-05-29 11:53:59,382 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 15:24:33, time: 0.664, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0360, loss_rpn_bbox: 0.0604, loss_cls: 0.2605, acc: 91.3560, loss_bbox: 0.2974, loss_mask: 0.3009, loss: 0.9553 2024-05-29 11:54:34,899 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 15:24:05, time: 0.710, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0620, loss_cls: 0.2608, acc: 91.1956, loss_bbox: 0.3024, loss_mask: 0.2909, loss: 0.9520 2024-05-29 11:55:14,369 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 15:24:16, time: 0.789, data_time: 0.042, memory: 11626, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0614, loss_cls: 0.2658, acc: 91.2112, loss_bbox: 0.3072, loss_mask: 0.2956, loss: 0.9651 2024-05-29 11:55:47,728 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 15:23:27, time: 0.667, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0613, loss_cls: 0.2571, acc: 91.5117, loss_bbox: 0.2949, loss_mask: 0.2888, loss: 0.9392 2024-05-29 11:56:20,549 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 15:22:34, time: 0.656, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0579, loss_cls: 0.2587, acc: 91.3135, loss_bbox: 0.2999, loss_mask: 0.2958, loss: 0.9461 2024-05-29 11:56:54,097 - mmdet - INFO - Epoch [2][1100/7330] lr: 1.000e-04, eta: 15:21:47, time: 0.671, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0613, loss_cls: 0.2581, acc: 91.2429, loss_bbox: 0.3011, loss_mask: 0.2966, loss: 0.9517 2024-05-29 11:57:27,284 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 15:20:58, time: 0.664, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0610, loss_cls: 0.2615, acc: 91.0654, loss_bbox: 0.3071, loss_mask: 0.2948, loss: 0.9583 2024-05-29 11:58:00,489 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 15:20:09, time: 0.664, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0579, loss_cls: 0.2518, acc: 91.7278, loss_bbox: 0.2836, loss_mask: 0.2897, loss: 0.9167 2024-05-29 11:58:36,223 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 15:19:43, time: 0.714, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0615, loss_cls: 0.2646, acc: 91.1819, loss_bbox: 0.3085, loss_mask: 0.2901, loss: 0.9576 2024-05-29 11:59:11,909 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 15:19:17, time: 0.714, data_time: 0.065, memory: 11626, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0608, loss_cls: 0.2602, acc: 91.2693, loss_bbox: 0.3061, loss_mask: 0.2927, loss: 0.9566 2024-05-29 11:59:49,818 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 15:19:11, time: 0.758, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0575, loss_cls: 0.2497, acc: 91.6902, loss_bbox: 0.2899, loss_mask: 0.2922, loss: 0.9205 2024-05-29 12:00:22,911 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 15:18:20, time: 0.662, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0550, loss_cls: 0.2562, acc: 91.6240, loss_bbox: 0.2897, loss_mask: 0.2882, loss: 0.9244 2024-05-29 12:00:56,086 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 15:17:31, time: 0.663, data_time: 0.049, memory: 
11626, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0618, loss_cls: 0.2695, acc: 91.1736, loss_bbox: 0.3021, loss_mask: 0.3010, loss: 0.9706 2024-05-29 12:01:29,212 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 15:16:42, time: 0.663, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0592, loss_cls: 0.2629, acc: 91.1655, loss_bbox: 0.3036, loss_mask: 0.2920, loss: 0.9523 2024-05-29 12:02:01,907 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 15:15:49, time: 0.654, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0602, loss_cls: 0.2568, acc: 91.5156, loss_bbox: 0.2938, loss_mask: 0.2939, loss: 0.9381 2024-05-29 12:02:34,874 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 15:14:58, time: 0.659, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0645, loss_cls: 0.2563, acc: 91.2402, loss_bbox: 0.3012, loss_mask: 0.3007, loss: 0.9578 2024-05-29 12:03:08,984 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 15:14:18, time: 0.682, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0643, loss_cls: 0.2580, acc: 91.3679, loss_bbox: 0.3028, loss_mask: 0.2924, loss: 0.9521 2024-05-29 12:03:42,228 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 15:13:30, time: 0.665, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0595, loss_cls: 0.2572, acc: 91.4683, loss_bbox: 0.2980, loss_mask: 0.2919, loss: 0.9379 2024-05-29 12:04:15,194 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 15:12:40, time: 0.659, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0591, loss_cls: 0.2612, acc: 91.1450, loss_bbox: 0.3074, loss_mask: 0.2895, loss: 0.9501 2024-05-29 12:04:48,981 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 15:11:58, time: 0.676, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0563, loss_cls: 0.2428, acc: 91.8875, loss_bbox: 0.2828, loss_mask: 0.2846, loss: 0.8969 2024-05-29 12:05:31,287 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 15:12:28, time: 0.846, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0599, loss_cls: 0.2500, acc: 91.6318, loss_bbox: 0.2892, loss_mask: 0.2887, loss: 0.9217 2024-05-29 12:06:04,367 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 15:11:39, time: 0.662, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0655, loss_cls: 0.2652, acc: 90.9844, loss_bbox: 0.3089, loss_mask: 0.2963, loss: 0.9756 2024-05-29 12:06:37,503 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 15:10:51, time: 0.663, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0581, loss_cls: 0.2452, acc: 91.7522, loss_bbox: 0.2901, loss_mask: 0.2858, loss: 0.9136 2024-05-29 12:07:11,306 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 15:10:08, time: 0.676, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0603, loss_cls: 0.2462, acc: 91.6975, loss_bbox: 0.2864, loss_mask: 0.2863, loss: 0.9126 2024-05-29 12:07:44,198 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 15:09:18, time: 0.658, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0589, loss_cls: 0.2592, acc: 91.2170, loss_bbox: 0.2951, loss_mask: 0.2792, loss: 0.9261 2024-05-29 12:08:18,005 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 15:08:36, time: 0.676, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0609, loss_cls: 0.2585, acc: 91.3689, loss_bbox: 
0.2954, loss_mask: 0.2901, loss: 0.9418 2024-05-29 12:08:54,923 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 15:08:19, time: 0.738, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0570, loss_cls: 0.2582, acc: 91.4900, loss_bbox: 0.2876, loss_mask: 0.2867, loss: 0.9214 2024-05-29 12:09:29,722 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 15:07:45, time: 0.696, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0588, loss_cls: 0.2485, acc: 91.8162, loss_bbox: 0.2822, loss_mask: 0.2904, loss: 0.9147 2024-05-29 12:10:07,073 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 15:07:32, time: 0.747, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0572, loss_cls: 0.2524, acc: 91.5283, loss_bbox: 0.2863, loss_mask: 0.2886, loss: 0.9180 2024-05-29 12:10:39,602 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 15:06:39, time: 0.651, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0567, loss_cls: 0.2471, acc: 91.9268, loss_bbox: 0.2780, loss_mask: 0.2835, loss: 0.8996 2024-05-29 12:11:13,280 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 15:05:56, time: 0.673, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0569, loss_cls: 0.2550, acc: 91.5698, loss_bbox: 0.2842, loss_mask: 0.2839, loss: 0.9120 2024-05-29 12:11:46,967 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 15:05:13, time: 0.674, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0610, loss_cls: 0.2534, acc: 91.3733, loss_bbox: 0.2996, loss_mask: 0.2869, loss: 0.9358 2024-05-29 12:12:19,862 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 15:04:24, time: 0.658, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0567, loss_cls: 0.2407, acc: 92.0132, loss_bbox: 0.2815, loss_mask: 0.2881, loss: 0.9003 2024-05-29 12:12:53,516 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 15:03:41, time: 0.673, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0618, loss_cls: 0.2628, acc: 91.2871, loss_bbox: 0.2971, loss_mask: 0.2902, loss: 0.9458 2024-05-29 12:13:26,516 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 15:02:53, time: 0.660, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0582, loss_cls: 0.2512, acc: 91.6746, loss_bbox: 0.2906, loss_mask: 0.2823, loss: 0.9150 2024-05-29 12:13:59,398 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 15:02:04, time: 0.658, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0584, loss_cls: 0.2450, acc: 91.6016, loss_bbox: 0.2895, loss_mask: 0.2832, loss: 0.9063 2024-05-29 12:14:32,412 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 15:01:16, time: 0.660, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0582, loss_cls: 0.2501, acc: 91.5847, loss_bbox: 0.2898, loss_mask: 0.2865, loss: 0.9169 2024-05-29 12:15:05,638 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 15:00:30, time: 0.665, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0572, loss_cls: 0.2493, acc: 91.6680, loss_bbox: 0.2879, loss_mask: 0.2828, loss: 0.9093 2024-05-29 12:15:45,513 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 15:00:36, time: 0.797, data_time: 0.062, memory: 11626, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0568, loss_cls: 0.2540, acc: 91.6921, loss_bbox: 0.2876, loss_mask: 0.2929, loss: 0.9237 2024-05-29 12:16:20,464 - mmdet - INFO - Epoch 
[2][2800/7330] lr: 1.000e-04, eta: 15:00:03, time: 0.699, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0611, loss_cls: 0.2532, acc: 91.5427, loss_bbox: 0.2911, loss_mask: 0.2875, loss: 0.9249 2024-05-29 12:16:54,279 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 14:59:22, time: 0.676, data_time: 0.060, memory: 11626, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0593, loss_cls: 0.2488, acc: 91.6682, loss_bbox: 0.2865, loss_mask: 0.2883, loss: 0.9153 2024-05-29 12:17:27,670 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 14:58:37, time: 0.668, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0582, loss_cls: 0.2578, acc: 91.4143, loss_bbox: 0.2946, loss_mask: 0.2900, loss: 0.9360 2024-05-29 12:18:01,073 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 14:57:53, time: 0.668, data_time: 0.035, memory: 11626, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0588, loss_cls: 0.2540, acc: 91.4167, loss_bbox: 0.2916, loss_mask: 0.2885, loss: 0.9278 2024-05-29 12:18:34,458 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 14:57:08, time: 0.667, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0594, loss_cls: 0.2527, acc: 91.5029, loss_bbox: 0.2937, loss_mask: 0.2841, loss: 0.9232 2024-05-29 12:19:10,542 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 14:56:44, time: 0.722, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0601, loss_cls: 0.2558, acc: 91.4998, loss_bbox: 0.2890, loss_mask: 0.2833, loss: 0.9241 2024-05-29 12:19:45,703 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 14:56:13, time: 0.703, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0583, loss_cls: 0.2495, acc: 91.6995, loss_bbox: 0.2902, loss_mask: 0.2887, loss: 0.9208 2024-05-29 12:20:24,152 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 14:56:06, time: 0.769, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0587, loss_cls: 0.2619, acc: 91.3059, loss_bbox: 0.2975, loss_mask: 0.2813, loss: 0.9336 2024-05-29 12:20:56,982 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 14:55:18, time: 0.657, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0548, loss_cls: 0.2421, acc: 91.9314, loss_bbox: 0.2843, loss_mask: 0.2767, loss: 0.8882 2024-05-29 12:21:30,408 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 14:54:34, time: 0.669, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0605, loss_cls: 0.2646, acc: 90.9807, loss_bbox: 0.3077, loss_mask: 0.2882, loss: 0.9547 2024-05-29 12:22:03,730 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 14:53:49, time: 0.666, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0557, loss_cls: 0.2558, acc: 91.3848, loss_bbox: 0.2934, loss_mask: 0.2834, loss: 0.9209 2024-05-29 12:22:36,833 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 14:53:03, time: 0.662, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0601, loss_cls: 0.2438, acc: 91.6785, loss_bbox: 0.2842, loss_mask: 0.2765, loss: 0.8972 2024-05-29 12:23:10,245 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 14:52:20, time: 0.668, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0601, loss_cls: 0.2617, acc: 91.2727, loss_bbox: 0.2910, loss_mask: 0.2879, loss: 0.9332 2024-05-29 12:23:44,270 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 14:51:40, time: 0.681, data_time: 0.044, memory: 11626, 
loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0608, loss_cls: 0.2529, acc: 91.4409, loss_bbox: 0.2954, loss_mask: 0.2855, loss: 0.9313 2024-05-29 12:24:17,393 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 14:50:55, time: 0.662, data_time: 0.042, memory: 11626, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0580, loss_cls: 0.2558, acc: 91.3950, loss_bbox: 0.2956, loss_mask: 0.2841, loss: 0.9249 2024-05-29 12:24:50,505 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 14:50:09, time: 0.662, data_time: 0.064, memory: 11626, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0606, loss_cls: 0.2523, acc: 91.5171, loss_bbox: 0.2926, loss_mask: 0.2839, loss: 0.9249 2024-05-29 12:25:23,259 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 14:49:21, time: 0.655, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0364, loss_rpn_bbox: 0.0607, loss_cls: 0.2520, acc: 91.4590, loss_bbox: 0.2953, loss_mask: 0.2849, loss: 0.9294 2024-05-29 12:26:01,712 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 14:49:13, time: 0.769, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0603, loss_cls: 0.2606, acc: 91.3357, loss_bbox: 0.2998, loss_mask: 0.2887, loss: 0.9423 2024-05-29 12:26:39,047 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 14:48:57, time: 0.747, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0598, loss_cls: 0.2585, acc: 91.4282, loss_bbox: 0.2962, loss_mask: 0.2822, loss: 0.9271 2024-05-29 12:27:12,575 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 14:48:15, time: 0.670, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0626, loss_cls: 0.2517, acc: 91.4141, loss_bbox: 0.2950, loss_mask: 0.2864, loss: 0.9319 2024-05-29 12:27:46,102 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 14:47:32, time: 0.671, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0576, loss_cls: 0.2453, acc: 91.6572, loss_bbox: 0.2920, loss_mask: 0.2830, loss: 0.9105 2024-05-29 12:28:19,100 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 14:46:46, time: 0.660, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0583, loss_cls: 0.2443, acc: 91.7998, loss_bbox: 0.2827, loss_mask: 0.2813, loss: 0.8961 2024-05-29 12:28:52,444 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 14:46:03, time: 0.667, data_time: 0.042, memory: 11626, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0566, loss_cls: 0.2523, acc: 91.5493, loss_bbox: 0.2838, loss_mask: 0.2801, loss: 0.9031 2024-05-29 12:29:25,803 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 14:45:19, time: 0.667, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0589, loss_cls: 0.2454, acc: 91.6799, loss_bbox: 0.2879, loss_mask: 0.2837, loss: 0.9082 2024-05-29 12:30:01,787 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 14:44:54, time: 0.720, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0586, loss_cls: 0.2563, acc: 91.4326, loss_bbox: 0.2971, loss_mask: 0.2841, loss: 0.9305 2024-05-29 12:30:37,063 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 14:44:24, time: 0.706, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0575, loss_cls: 0.2427, acc: 91.7314, loss_bbox: 0.2854, loss_mask: 0.2816, loss: 0.8974 2024-05-29 12:31:14,788 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 14:44:09, time: 0.754, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0571, loss_cls: 0.2482, acc: 91.6379, loss_bbox: 0.2845, 
loss_mask: 0.2864, loss: 0.9081 2024-05-29 12:31:47,821 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 14:43:24, time: 0.661, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0341, loss_rpn_bbox: 0.0607, loss_cls: 0.2506, acc: 91.5657, loss_bbox: 0.2921, loss_mask: 0.2822, loss: 0.9197 2024-05-29 12:32:21,019 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 14:42:40, time: 0.664, data_time: 0.034, memory: 11626, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0597, loss_cls: 0.2472, acc: 91.6184, loss_bbox: 0.2896, loss_mask: 0.2786, loss: 0.9057 2024-05-29 12:32:54,178 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 14:41:55, time: 0.663, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0592, loss_cls: 0.2557, acc: 91.3970, loss_bbox: 0.2970, loss_mask: 0.2779, loss: 0.9206 2024-05-29 12:33:27,306 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 14:41:11, time: 0.663, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0560, loss_cls: 0.2506, acc: 91.5586, loss_bbox: 0.2872, loss_mask: 0.2754, loss: 0.9024 2024-05-29 12:34:00,617 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 14:40:27, time: 0.666, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0588, loss_cls: 0.2551, acc: 91.4304, loss_bbox: 0.2936, loss_mask: 0.2800, loss: 0.9196 2024-05-29 12:34:33,630 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 14:39:42, time: 0.660, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0562, loss_cls: 0.2445, acc: 91.7188, loss_bbox: 0.2849, loss_mask: 0.2896, loss: 0.9087 2024-05-29 12:35:06,990 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 14:38:59, time: 0.667, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0575, loss_cls: 0.2491, acc: 91.6030, loss_bbox: 0.2852, loss_mask: 0.2850, loss: 0.9090 2024-05-29 12:35:40,346 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 14:38:17, time: 0.667, data_time: 0.063, memory: 11626, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0600, loss_cls: 0.2504, acc: 91.5691, loss_bbox: 0.2919, loss_mask: 0.2806, loss: 0.9153 2024-05-29 12:36:13,565 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 14:37:33, time: 0.664, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0566, loss_cls: 0.2419, acc: 91.8918, loss_bbox: 0.2792, loss_mask: 0.2807, loss: 0.8888 2024-05-29 12:36:56,826 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 14:37:54, time: 0.865, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0569, loss_cls: 0.2489, acc: 91.6702, loss_bbox: 0.2827, loss_mask: 0.2852, loss: 0.9047 2024-05-29 12:37:30,474 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 14:37:13, time: 0.673, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0619, loss_cls: 0.2564, acc: 91.2720, loss_bbox: 0.2949, loss_mask: 0.2812, loss: 0.9286 2024-05-29 12:38:04,229 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 14:36:33, time: 0.675, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0592, loss_cls: 0.2554, acc: 91.4028, loss_bbox: 0.2913, loss_mask: 0.2847, loss: 0.9244 2024-05-29 12:38:37,741 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 14:35:51, time: 0.670, data_time: 0.040, memory: 11626, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0586, loss_cls: 0.2636, acc: 91.0989, loss_bbox: 0.3042, loss_mask: 0.2850, loss: 0.9451 2024-05-29 12:39:11,486 - mmdet - INFO - Epoch [2][4800/7330] 
lr: 1.000e-04, eta: 14:35:11, time: 0.675, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0561, loss_cls: 0.2349, acc: 91.9976, loss_bbox: 0.2795, loss_mask: 0.2760, loss: 0.8781 2024-05-29 12:39:44,532 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 14:34:26, time: 0.661, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0567, loss_cls: 0.2463, acc: 91.6006, loss_bbox: 0.2932, loss_mask: 0.2861, loss: 0.9108 2024-05-29 12:40:20,183 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 14:33:58, time: 0.713, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0597, loss_cls: 0.2513, acc: 91.4937, loss_bbox: 0.2922, loss_mask: 0.2817, loss: 0.9169 2024-05-29 12:40:55,727 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 14:33:29, time: 0.711, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0596, loss_cls: 0.2493, acc: 91.4343, loss_bbox: 0.2911, loss_mask: 0.2789, loss: 0.9119 2024-05-29 12:41:33,454 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 14:33:14, time: 0.755, data_time: 0.041, memory: 11626, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0569, loss_cls: 0.2361, acc: 92.0413, loss_bbox: 0.2737, loss_mask: 0.2805, loss: 0.8781 2024-05-29 12:42:06,825 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 14:32:31, time: 0.667, data_time: 0.040, memory: 11626, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0596, loss_cls: 0.2377, acc: 91.8560, loss_bbox: 0.2785, loss_mask: 0.2743, loss: 0.8826 2024-05-29 12:42:39,911 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 14:31:47, time: 0.662, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0547, loss_cls: 0.2304, acc: 92.2190, loss_bbox: 0.2710, loss_mask: 0.2694, loss: 0.8565 2024-05-29 12:43:13,173 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 14:31:04, time: 0.665, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0507, loss_cls: 0.2332, acc: 92.1094, loss_bbox: 0.2747, loss_mask: 0.2768, loss: 0.8644 2024-05-29 12:43:47,053 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 14:30:25, time: 0.678, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0582, loss_cls: 0.2407, acc: 91.7729, loss_bbox: 0.2798, loss_mask: 0.2824, loss: 0.8931 2024-05-29 12:44:20,794 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 14:29:45, time: 0.675, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0566, loss_cls: 0.2406, acc: 91.8555, loss_bbox: 0.2805, loss_mask: 0.2756, loss: 0.8858 2024-05-29 12:44:53,939 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 14:29:02, time: 0.663, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0583, loss_cls: 0.2436, acc: 91.8242, loss_bbox: 0.2827, loss_mask: 0.2803, loss: 0.8967 2024-05-29 12:45:27,156 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 14:28:19, time: 0.664, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0587, loss_cls: 0.2439, acc: 91.6367, loss_bbox: 0.2875, loss_mask: 0.2782, loss: 0.9014 2024-05-29 12:46:00,369 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 14:27:36, time: 0.664, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0562, loss_cls: 0.2439, acc: 91.8337, loss_bbox: 0.2765, loss_mask: 0.2810, loss: 0.8872 2024-05-29 12:46:33,540 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 14:26:53, time: 0.663, data_time: 0.045, memory: 11626, loss_rpn_cls: 
0.0319, loss_rpn_bbox: 0.0574, loss_cls: 0.2518, acc: 91.4431, loss_bbox: 0.2879, loss_mask: 0.2804, loss: 0.9094 2024-05-29 12:47:13,980 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 14:26:53, time: 0.809, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0626, loss_cls: 0.2636, acc: 91.0491, loss_bbox: 0.3017, loss_mask: 0.2813, loss: 0.9431 2024-05-29 12:47:49,416 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 14:26:23, time: 0.709, data_time: 0.051, memory: 11626, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0567, loss_cls: 0.2403, acc: 91.8054, loss_bbox: 0.2842, loss_mask: 0.2838, loss: 0.8966 2024-05-29 12:48:23,004 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 14:25:42, time: 0.672, data_time: 0.064, memory: 11626, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0547, loss_cls: 0.2461, acc: 91.6201, loss_bbox: 0.2891, loss_mask: 0.2801, loss: 0.8992 2024-05-29 12:48:56,599 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 14:25:02, time: 0.672, data_time: 0.058, memory: 11626, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0606, loss_cls: 0.2594, acc: 91.3035, loss_bbox: 0.2965, loss_mask: 0.2859, loss: 0.9363 2024-05-29 12:49:29,478 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 14:24:17, time: 0.658, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0548, loss_cls: 0.2370, acc: 92.1604, loss_bbox: 0.2689, loss_mask: 0.2781, loss: 0.8697 2024-05-29 12:50:02,620 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 14:23:34, time: 0.663, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0570, loss_cls: 0.2405, acc: 91.9844, loss_bbox: 0.2747, loss_mask: 0.2757, loss: 0.8788 2024-05-29 12:50:38,407 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 14:23:07, time: 0.716, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0560, loss_cls: 0.2477, acc: 91.6401, loss_bbox: 0.2891, loss_mask: 0.2827, loss: 0.9061 2024-05-29 12:51:13,891 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 14:22:37, time: 0.710, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0537, loss_cls: 0.2404, acc: 91.8301, loss_bbox: 0.2784, loss_mask: 0.2722, loss: 0.8738 2024-05-29 12:51:51,654 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 14:22:20, time: 0.755, data_time: 0.049, memory: 11626, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0548, loss_cls: 0.2551, acc: 91.5137, loss_bbox: 0.2879, loss_mask: 0.2732, loss: 0.9026 2024-05-29 12:52:24,466 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 14:21:35, time: 0.656, data_time: 0.047, memory: 11626, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0551, loss_cls: 0.2357, acc: 92.0520, loss_bbox: 0.2759, loss_mask: 0.2790, loss: 0.8774 2024-05-29 12:52:57,202 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 14:20:50, time: 0.655, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0546, loss_cls: 0.2452, acc: 91.6589, loss_bbox: 0.2785, loss_mask: 0.2767, loss: 0.8838 2024-05-29 12:53:30,235 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 14:20:07, time: 0.661, data_time: 0.059, memory: 11626, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0586, loss_cls: 0.2417, acc: 91.6829, loss_bbox: 0.2852, loss_mask: 0.2789, loss: 0.9004 2024-05-29 12:54:03,360 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 14:19:24, time: 0.663, data_time: 0.055, memory: 11626, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0561, loss_cls: 0.2572, acc: 91.3528, loss_bbox: 0.2927, loss_mask: 
0.2804, loss: 0.9177 2024-05-29 12:54:36,365 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 14:18:41, time: 0.660, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0546, loss_cls: 0.2365, acc: 92.0261, loss_bbox: 0.2721, loss_mask: 0.2764, loss: 0.8716 2024-05-29 12:55:09,662 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 14:17:59, time: 0.666, data_time: 0.056, memory: 11626, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0586, loss_cls: 0.2516, acc: 91.4685, loss_bbox: 0.2921, loss_mask: 0.2803, loss: 0.9155 2024-05-29 12:55:43,135 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 14:17:18, time: 0.669, data_time: 0.039, memory: 11626, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0572, loss_cls: 0.2455, acc: 91.6956, loss_bbox: 0.2870, loss_mask: 0.2798, loss: 0.9013 2024-05-29 12:56:16,111 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 14:16:35, time: 0.660, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0549, loss_cls: 0.2382, acc: 91.8357, loss_bbox: 0.2783, loss_mask: 0.2722, loss: 0.8738 2024-05-29 12:56:49,203 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 14:15:52, time: 0.662, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0545, loss_cls: 0.2386, acc: 91.8750, loss_bbox: 0.2818, loss_mask: 0.2742, loss: 0.8802 2024-05-29 12:57:27,188 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 14:15:36, time: 0.760, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0561, loss_cls: 0.2441, acc: 91.7297, loss_bbox: 0.2816, loss_mask: 0.2766, loss: 0.8901 2024-05-29 12:58:04,599 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 14:15:17, time: 0.748, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0558, loss_cls: 0.2406, acc: 91.8169, loss_bbox: 0.2786, loss_mask: 0.2749, loss: 0.8787 2024-05-29 12:58:37,420 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 14:14:33, time: 0.656, data_time: 0.052, memory: 11626, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0543, loss_cls: 0.2409, acc: 91.7393, loss_bbox: 0.2791, loss_mask: 0.2773, loss: 0.8816 2024-05-29 12:59:10,325 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 14:13:49, time: 0.658, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0536, loss_cls: 0.2401, acc: 91.8713, loss_bbox: 0.2792, loss_mask: 0.2766, loss: 0.8793 2024-05-29 12:59:43,805 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 14:13:09, time: 0.670, data_time: 0.070, memory: 11626, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0556, loss_cls: 0.2358, acc: 91.8770, loss_bbox: 0.2798, loss_mask: 0.2731, loss: 0.8749 2024-05-29 13:00:16,862 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 14:12:26, time: 0.661, data_time: 0.053, memory: 11626, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0559, loss_cls: 0.2443, acc: 91.6206, loss_bbox: 0.2884, loss_mask: 0.2751, loss: 0.8954 2024-05-29 13:00:49,910 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 14:11:44, time: 0.661, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0556, loss_cls: 0.2363, acc: 91.9045, loss_bbox: 0.2768, loss_mask: 0.2708, loss: 0.8715 2024-05-29 13:01:25,858 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 14:11:17, time: 0.719, data_time: 0.046, memory: 11626, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0597, loss_cls: 0.2398, acc: 91.8379, loss_bbox: 0.2842, loss_mask: 0.2812, loss: 0.8956 2024-05-29 13:02:00,865 - mmdet - INFO - Epoch [2][6800/7330] lr: 
1.000e-04, eta: 14:10:44, time: 0.700, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0537, loss_cls: 0.2383, acc: 91.9395, loss_bbox: 0.2773, loss_mask: 0.2781, loss: 0.8779 2024-05-29 13:02:38,459 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 14:10:25, time: 0.752, data_time: 0.061, memory: 11626, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0550, loss_cls: 0.2385, acc: 91.8555, loss_bbox: 0.2807, loss_mask: 0.2756, loss: 0.8805 2024-05-29 13:03:12,240 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 14:09:47, time: 0.676, data_time: 0.057, memory: 11626, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0577, loss_cls: 0.2442, acc: 91.6987, loss_bbox: 0.2796, loss_mask: 0.2770, loss: 0.8898 2024-05-29 13:03:45,468 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 14:09:05, time: 0.665, data_time: 0.045, memory: 11626, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0580, loss_cls: 0.2512, acc: 91.7063, loss_bbox: 0.2812, loss_mask: 0.2745, loss: 0.8958 2024-05-29 13:04:18,688 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 14:08:24, time: 0.664, data_time: 0.043, memory: 11626, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0574, loss_cls: 0.2432, acc: 91.7812, loss_bbox: 0.2816, loss_mask: 0.2773, loss: 0.8920 2024-05-29 13:04:52,097 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 14:07:43, time: 0.668, data_time: 0.044, memory: 11626, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0564, loss_cls: 0.2486, acc: 91.5945, loss_bbox: 0.2859, loss_mask: 0.2755, loss: 0.8986 2024-05-29 13:05:25,515 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 14:07:03, time: 0.668, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0574, loss_cls: 0.2466, acc: 91.5244, loss_bbox: 0.2881, loss_mask: 0.2733, loss: 0.8960 2024-05-29 13:05:58,831 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 14:06:22, time: 0.666, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0545, loss_cls: 0.2313, acc: 92.2151, loss_bbox: 0.2663, loss_mask: 0.2722, loss: 0.8552 2024-05-29 13:06:32,259 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 14:05:42, time: 0.669, data_time: 0.048, memory: 11626, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0582, loss_cls: 0.2425, acc: 91.7988, loss_bbox: 0.2780, loss_mask: 0.2707, loss: 0.8805 2024-05-29 13:07:05,420 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 14:05:00, time: 0.663, data_time: 0.054, memory: 11626, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0541, loss_cls: 0.2375, acc: 92.0771, loss_bbox: 0.2684, loss_mask: 0.2716, loss: 0.8637 2024-05-29 13:07:38,601 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 14:04:19, time: 0.663, data_time: 0.050, memory: 11626, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0566, loss_cls: 0.2468, acc: 91.6794, loss_bbox: 0.2858, loss_mask: 0.2754, loss: 0.8965 2024-05-29 13:08:06,793 - mmdet - INFO - Saving checkpoint at 2 epochs 2024-05-29 13:10:02,068 - mmdet - INFO - Evaluating bbox... 
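Note: every training record above follows the same key/value layout (lr, eta, time, data_time, memory, the five loss terms, and their sum reported as loss); in these entries the reported loss equals the sum of loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox and loss_mask (e.g. 0.0342 + 0.0574 + 0.2539 + 0.2978 + 0.2951 = 0.9384 for iteration [2][800]). The snippet below is a minimal parsing sketch for pulling those values out of a plain-text mmdet 2.x log such as this one; the file name train.log and the assumption that each record occupies a single log line are illustrative choices, not taken from this run.

import re

# Minimal sketch: extract per-iteration losses from an mmdet 2.x text log
# shaped like the records above. "train.log" is a placeholder path and the
# regex assumes each record sits on one line with the keys in the order shown.
ITER_RE = re.compile(
    r"Epoch \[(\d+)\]\[(\d+)/\d+\] lr: [\d.e-]+, eta: [\d:]+, "
    r"time: [\d.]+, data_time: [\d.]+, memory: \d+, "
    r"loss_rpn_cls: ([\d.]+), loss_rpn_bbox: ([\d.]+), loss_cls: ([\d.]+), "
    r"acc: [\d.]+, loss_bbox: ([\d.]+), loss_mask: ([\d.]+), loss: ([\d.]+)"
)

def parse_log(path="train.log"):
    """Yield (epoch, iteration, component losses, reported total) per record."""
    with open(path) as f:
        for m in ITER_RE.finditer(f.read()):
            epoch, it = int(m.group(1)), int(m.group(2))
            parts = [float(x) for x in m.group(3, 4, 5, 6, 7)]
            total = float(m.group(8))
            yield epoch, it, parts, total

if __name__ == "__main__":
    for epoch, it, parts, total in parse_log():
        # The reported total matches the sum of the five parts up to rounding,
        # e.g. 0.0342 + 0.0574 + 0.2539 + 0.2978 + 0.2951 = 0.9384.
        print(f"epoch {epoch} iter {it}: loss={total:.4f} (sum of parts={sum(parts):.4f})")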
2024-05-29 13:10:31,943 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.356
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.593
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.383
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.201
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.396
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.489
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.489
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.489
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.489
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.295
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.536
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.639
2024-05-29 13:10:31,944 - mmdet - INFO - Evaluating segm...
2024-05-29 13:11:03,151 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.333
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.558
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.349
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.138
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.365
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.516
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.453
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.453
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.453
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.243
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.505
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
2024-05-29 13:11:03,933 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 13:11:03,935 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3560, bbox_mAP_50: 0.5930, bbox_mAP_75: 0.3830, bbox_mAP_s: 0.2010, bbox_mAP_m: 0.3960, bbox_mAP_l: 0.4890, bbox_mAP_copypaste: 0.356 0.593 0.383 0.201 0.396 0.489, segm_mAP: 0.3330, segm_mAP_50: 0.5580, segm_mAP_75: 0.3490, segm_mAP_s: 0.1380, segm_mAP_m: 0.3650, segm_mAP_l: 0.5160, segm_mAP_copypaste: 0.333 0.558 0.349 0.138 0.365 0.516
2024-05-29 13:11:45,808 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 14:02:17, time: 0.837, data_time: 0.115, memory: 11628, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0548, loss_cls: 0.2304, acc: 92.1086, loss_bbox: 0.2686, loss_mask: 0.2662, loss: 0.8465
2024-05-29 13:12:19,637 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 14:01:39, time: 0.677, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0549, loss_cls: 0.2401, acc: 91.6743, loss_bbox: 0.2859, loss_mask: 0.2723, loss: 0.8805
2024-05-29 13:12:53,370 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 14:01:00, time: 0.675, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0563, loss_cls: 0.2285, acc: 91.9324, loss_bbox: 0.2797, loss_mask: 0.2755, loss: 0.8678
2024-05-29 13:13:26,652 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 14:00:20, time: 0.666, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0527, loss_cls: 0.2290, acc: 92.0479, loss_bbox: 0.2729, loss_mask: 0.2680, loss: 0.8496
2024-05-29 13:14:00,242 -
mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 13:59:41, time: 0.672, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0566, loss_cls: 0.2381, acc: 91.7161, loss_bbox: 0.2870, loss_mask: 0.2698, loss: 0.8789 2024-05-29 13:14:33,546 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 13:59:01, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0572, loss_cls: 0.2357, acc: 91.7954, loss_bbox: 0.2803, loss_mask: 0.2727, loss: 0.8749 2024-05-29 13:15:07,611 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 13:58:24, time: 0.681, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0547, loss_cls: 0.2319, acc: 91.8467, loss_bbox: 0.2813, loss_mask: 0.2698, loss: 0.8660 2024-05-29 13:15:40,674 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 13:57:43, time: 0.661, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0534, loss_cls: 0.2305, acc: 92.1226, loss_bbox: 0.2734, loss_mask: 0.2683, loss: 0.8528 2024-05-29 13:16:14,510 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 13:57:05, time: 0.677, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0579, loss_cls: 0.2322, acc: 92.0229, loss_bbox: 0.2776, loss_mask: 0.2715, loss: 0.8718 2024-05-29 13:16:47,961 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 13:56:26, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0517, loss_cls: 0.2139, acc: 92.5632, loss_bbox: 0.2606, loss_mask: 0.2629, loss: 0.8165 2024-05-29 13:17:21,216 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 13:55:46, time: 0.665, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0520, loss_cls: 0.2226, acc: 92.2634, loss_bbox: 0.2675, loss_mask: 0.2712, loss: 0.8390 2024-05-29 13:17:54,609 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 13:55:06, time: 0.668, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0534, loss_cls: 0.2244, acc: 92.2112, loss_bbox: 0.2696, loss_mask: 0.2678, loss: 0.8424 2024-05-29 13:18:28,216 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 13:54:28, time: 0.672, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0548, loss_cls: 0.2360, acc: 91.7690, loss_bbox: 0.2789, loss_mask: 0.2667, loss: 0.8628 2024-05-29 13:19:01,996 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 13:53:50, time: 0.676, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0544, loss_cls: 0.2293, acc: 92.0449, loss_bbox: 0.2722, loss_mask: 0.2729, loss: 0.8564 2024-05-29 13:19:43,082 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 13:53:47, time: 0.822, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0539, loss_cls: 0.2315, acc: 91.8738, loss_bbox: 0.2820, loss_mask: 0.2731, loss: 0.8671 2024-05-29 13:20:19,007 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 13:53:19, time: 0.718, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0534, loss_cls: 0.2350, acc: 91.8657, loss_bbox: 0.2791, loss_mask: 0.2717, loss: 0.8656 2024-05-29 13:20:52,892 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 13:52:42, time: 0.678, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0580, loss_cls: 0.2365, acc: 91.7781, loss_bbox: 0.2813, loss_mask: 0.2665, loss: 0.8691 2024-05-29 13:21:26,820 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 13:52:05, time: 0.678, data_time: 0.044, memory: 
11628, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0556, loss_cls: 0.2367, acc: 91.7515, loss_bbox: 0.2781, loss_mask: 0.2664, loss: 0.8648 2024-05-29 13:22:03,067 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 13:51:39, time: 0.725, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0536, loss_cls: 0.2258, acc: 92.1848, loss_bbox: 0.2710, loss_mask: 0.2629, loss: 0.8395 2024-05-29 13:22:39,670 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 13:51:14, time: 0.732, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0558, loss_cls: 0.2346, acc: 91.8320, loss_bbox: 0.2820, loss_mask: 0.2748, loss: 0.8747 2024-05-29 13:23:13,129 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 13:50:35, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0518, loss_cls: 0.2271, acc: 92.2227, loss_bbox: 0.2693, loss_mask: 0.2646, loss: 0.8393 2024-05-29 13:23:48,799 - mmdet - INFO - Epoch [3][1100/7330] lr: 1.000e-04, eta: 13:50:05, time: 0.713, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0521, loss_cls: 0.2272, acc: 92.0786, loss_bbox: 0.2733, loss_mask: 0.2654, loss: 0.8453 2024-05-29 13:24:24,212 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 13:49:35, time: 0.708, data_time: 0.070, memory: 11628, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0514, loss_cls: 0.2202, acc: 92.2900, loss_bbox: 0.2649, loss_mask: 0.2664, loss: 0.8300 2024-05-29 13:24:57,576 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 13:48:56, time: 0.667, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0513, loss_cls: 0.2181, acc: 92.3447, loss_bbox: 0.2635, loss_mask: 0.2678, loss: 0.8275 2024-05-29 13:25:31,109 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 13:48:17, time: 0.671, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0553, loss_cls: 0.2313, acc: 92.0356, loss_bbox: 0.2750, loss_mask: 0.2650, loss: 0.8541 2024-05-29 13:26:04,770 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 13:47:39, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0546, loss_cls: 0.2275, acc: 92.1060, loss_bbox: 0.2731, loss_mask: 0.2699, loss: 0.8537 2024-05-29 13:26:38,380 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 13:47:00, time: 0.672, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0519, loss_cls: 0.2322, acc: 91.9998, loss_bbox: 0.2775, loss_mask: 0.2637, loss: 0.8521 2024-05-29 13:27:12,583 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 13:46:24, time: 0.684, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0557, loss_cls: 0.2372, acc: 91.7334, loss_bbox: 0.2800, loss_mask: 0.2735, loss: 0.8783 2024-05-29 13:27:46,826 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 13:45:49, time: 0.685, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0555, loss_cls: 0.2402, acc: 91.7117, loss_bbox: 0.2823, loss_mask: 0.2652, loss: 0.8727 2024-05-29 13:28:20,891 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 13:45:12, time: 0.681, data_time: 0.068, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0550, loss_cls: 0.2297, acc: 92.1223, loss_bbox: 0.2698, loss_mask: 0.2641, loss: 0.8450 2024-05-29 13:28:54,886 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 13:44:36, time: 0.680, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0562, loss_cls: 0.2327, acc: 91.9272, loss_bbox: 
0.2744, loss_mask: 0.2673, loss: 0.8576 2024-05-29 13:29:28,905 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 13:43:59, time: 0.680, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0569, loss_cls: 0.2340, acc: 91.8181, loss_bbox: 0.2780, loss_mask: 0.2731, loss: 0.8722 2024-05-29 13:30:08,555 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 13:43:48, time: 0.793, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0560, loss_cls: 0.2447, acc: 91.5081, loss_bbox: 0.2852, loss_mask: 0.2765, loss: 0.8908 2024-05-29 13:30:44,055 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 13:43:17, time: 0.710, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0511, loss_cls: 0.2242, acc: 92.1372, loss_bbox: 0.2738, loss_mask: 0.2644, loss: 0.8397 2024-05-29 13:31:17,941 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 13:42:40, time: 0.678, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0538, loss_cls: 0.2262, acc: 91.9966, loss_bbox: 0.2739, loss_mask: 0.2691, loss: 0.8494 2024-05-29 13:31:51,959 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 13:42:04, time: 0.680, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0545, loss_cls: 0.2290, acc: 91.9443, loss_bbox: 0.2749, loss_mask: 0.2645, loss: 0.8518 2024-05-29 13:32:28,425 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 13:41:38, time: 0.729, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0521, loss_cls: 0.2277, acc: 92.1633, loss_bbox: 0.2683, loss_mask: 0.2627, loss: 0.8385 2024-05-29 13:33:04,191 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 13:41:09, time: 0.715, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0538, loss_cls: 0.2290, acc: 92.0625, loss_bbox: 0.2712, loss_mask: 0.2677, loss: 0.8504 2024-05-29 13:33:37,882 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 13:40:31, time: 0.674, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0508, loss_cls: 0.2189, acc: 92.3767, loss_bbox: 0.2636, loss_mask: 0.2608, loss: 0.8184 2024-05-29 13:34:13,031 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 13:39:59, time: 0.703, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0530, loss_cls: 0.2244, acc: 92.1633, loss_bbox: 0.2693, loss_mask: 0.2709, loss: 0.8433 2024-05-29 13:34:48,753 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 13:39:30, time: 0.714, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0551, loss_cls: 0.2324, acc: 91.9841, loss_bbox: 0.2751, loss_mask: 0.2653, loss: 0.8575 2024-05-29 13:35:22,368 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 13:38:51, time: 0.672, data_time: 0.068, memory: 11628, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0556, loss_cls: 0.2299, acc: 92.0493, loss_bbox: 0.2729, loss_mask: 0.2664, loss: 0.8511 2024-05-29 13:35:55,568 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 13:38:11, time: 0.664, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0511, loss_cls: 0.2104, acc: 92.7959, loss_bbox: 0.2582, loss_mask: 0.2642, loss: 0.8101 2024-05-29 13:36:28,719 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 13:37:31, time: 0.663, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0515, loss_cls: 0.2157, acc: 92.5093, loss_bbox: 0.2607, loss_mask: 0.2594, loss: 0.8128 2024-05-29 13:37:03,110 - mmdet - INFO - Epoch 
[3][2250/7330] lr: 1.000e-04, eta: 13:36:56, time: 0.688, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0565, loss_cls: 0.2317, acc: 91.9338, loss_bbox: 0.2837, loss_mask: 0.2749, loss: 0.8752 2024-05-29 13:37:35,927 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 13:36:15, time: 0.656, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0529, loss_cls: 0.2270, acc: 92.1541, loss_bbox: 0.2712, loss_mask: 0.2694, loss: 0.8490 2024-05-29 13:38:09,459 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 13:35:36, time: 0.671, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0532, loss_cls: 0.2235, acc: 92.1436, loss_bbox: 0.2738, loss_mask: 0.2680, loss: 0.8435 2024-05-29 13:38:43,025 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 13:34:58, time: 0.671, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0569, loss_cls: 0.2367, acc: 91.8320, loss_bbox: 0.2809, loss_mask: 0.2709, loss: 0.8710 2024-05-29 13:39:16,520 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 13:34:19, time: 0.670, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0554, loss_cls: 0.2297, acc: 91.9995, loss_bbox: 0.2766, loss_mask: 0.2672, loss: 0.8570 2024-05-29 13:39:50,132 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 13:33:41, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0524, loss_cls: 0.2165, acc: 92.5623, loss_bbox: 0.2601, loss_mask: 0.2586, loss: 0.8155 2024-05-29 13:40:29,863 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 13:33:28, time: 0.795, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0547, loss_cls: 0.2313, acc: 91.9626, loss_bbox: 0.2732, loss_mask: 0.2635, loss: 0.8507 2024-05-29 13:41:03,503 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 13:32:50, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0560, loss_cls: 0.2277, acc: 92.0774, loss_bbox: 0.2698, loss_mask: 0.2657, loss: 0.8469 2024-05-29 13:41:39,766 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 13:32:23, time: 0.725, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0565, loss_cls: 0.2294, acc: 92.0793, loss_bbox: 0.2697, loss_mask: 0.2706, loss: 0.8549 2024-05-29 13:42:13,351 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 13:31:45, time: 0.672, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0497, loss_cls: 0.2246, acc: 92.1531, loss_bbox: 0.2654, loss_mask: 0.2675, loss: 0.8339 2024-05-29 13:42:49,724 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 13:31:18, time: 0.727, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0544, loss_cls: 0.2382, acc: 91.7317, loss_bbox: 0.2822, loss_mask: 0.2699, loss: 0.8744 2024-05-29 13:43:25,613 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 13:30:49, time: 0.718, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0560, loss_cls: 0.2382, acc: 91.8022, loss_bbox: 0.2802, loss_mask: 0.2655, loss: 0.8691 2024-05-29 13:43:58,997 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 13:30:10, time: 0.668, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0524, loss_cls: 0.2171, acc: 92.6331, loss_bbox: 0.2542, loss_mask: 0.2577, loss: 0.8103 2024-05-29 13:44:34,649 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 13:29:40, time: 0.713, data_time: 0.051, memory: 11628, 
loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0547, loss_cls: 0.2243, acc: 92.1545, loss_bbox: 0.2662, loss_mask: 0.2625, loss: 0.8338 2024-05-29 13:45:10,089 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 13:29:10, time: 0.709, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0523, loss_cls: 0.2211, acc: 92.3115, loss_bbox: 0.2669, loss_mask: 0.2679, loss: 0.8346 2024-05-29 13:45:43,154 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 13:28:29, time: 0.661, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0552, loss_cls: 0.2224, acc: 92.2227, loss_bbox: 0.2668, loss_mask: 0.2617, loss: 0.8320 2024-05-29 13:46:16,281 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 13:27:49, time: 0.662, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0520, loss_cls: 0.2128, acc: 92.5996, loss_bbox: 0.2584, loss_mask: 0.2585, loss: 0.8094 2024-05-29 13:46:49,481 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 13:27:10, time: 0.664, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0567, loss_cls: 0.2330, acc: 91.9443, loss_bbox: 0.2779, loss_mask: 0.2702, loss: 0.8646 2024-05-29 13:47:23,602 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 13:26:34, time: 0.682, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0578, loss_cls: 0.2416, acc: 91.6638, loss_bbox: 0.2862, loss_mask: 0.2691, loss: 0.8830 2024-05-29 13:47:56,938 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 13:25:55, time: 0.667, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0536, loss_cls: 0.2323, acc: 92.0024, loss_bbox: 0.2728, loss_mask: 0.2629, loss: 0.8482 2024-05-29 13:48:30,217 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 13:25:16, time: 0.666, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0565, loss_cls: 0.2351, acc: 91.8489, loss_bbox: 0.2778, loss_mask: 0.2660, loss: 0.8647 2024-05-29 13:49:04,646 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 13:24:41, time: 0.689, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0568, loss_cls: 0.2350, acc: 91.7893, loss_bbox: 0.2768, loss_mask: 0.2639, loss: 0.8626 2024-05-29 13:49:37,922 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 13:24:02, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0500, loss_cls: 0.2323, acc: 91.9136, loss_bbox: 0.2740, loss_mask: 0.2622, loss: 0.8453 2024-05-29 13:50:11,461 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 13:23:23, time: 0.671, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0505, loss_cls: 0.2278, acc: 92.0698, loss_bbox: 0.2731, loss_mask: 0.2614, loss: 0.8391 2024-05-29 13:50:52,160 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 13:23:13, time: 0.814, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0525, loss_cls: 0.2273, acc: 92.1697, loss_bbox: 0.2713, loss_mask: 0.2662, loss: 0.8441 2024-05-29 13:51:25,318 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 13:22:33, time: 0.663, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0511, loss_cls: 0.2153, acc: 92.5156, loss_bbox: 0.2558, loss_mask: 0.2610, loss: 0.8075 2024-05-29 13:52:01,377 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 13:22:05, time: 0.721, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0567, loss_cls: 0.2286, acc: 91.9485, loss_bbox: 0.2751, 
loss_mask: 0.2675, loss: 0.8549 2024-05-29 13:52:34,408 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 13:21:25, time: 0.661, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0535, loss_cls: 0.2195, acc: 92.4668, loss_bbox: 0.2616, loss_mask: 0.2624, loss: 0.8232 2024-05-29 13:53:10,857 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 13:20:58, time: 0.729, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0544, loss_cls: 0.2303, acc: 92.0928, loss_bbox: 0.2714, loss_mask: 0.2713, loss: 0.8542 2024-05-29 13:53:46,525 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 13:20:28, time: 0.713, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0495, loss_cls: 0.2237, acc: 92.2854, loss_bbox: 0.2622, loss_mask: 0.2625, loss: 0.8247 2024-05-29 13:54:20,224 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 13:19:50, time: 0.674, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0527, loss_cls: 0.2335, acc: 92.0310, loss_bbox: 0.2727, loss_mask: 0.2631, loss: 0.8506 2024-05-29 13:54:56,383 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 13:19:22, time: 0.723, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0511, loss_cls: 0.2295, acc: 92.1877, loss_bbox: 0.2668, loss_mask: 0.2649, loss: 0.8374 2024-05-29 13:55:31,902 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 13:18:51, time: 0.710, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0520, loss_cls: 0.2238, acc: 92.2344, loss_bbox: 0.2669, loss_mask: 0.2618, loss: 0.8313 2024-05-29 13:56:05,417 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 13:18:13, time: 0.670, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0540, loss_cls: 0.2279, acc: 92.0535, loss_bbox: 0.2743, loss_mask: 0.2676, loss: 0.8517 2024-05-29 13:56:38,503 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 13:17:33, time: 0.662, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0522, loss_cls: 0.2183, acc: 92.3108, loss_bbox: 0.2584, loss_mask: 0.2620, loss: 0.8164 2024-05-29 13:57:12,063 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 13:16:55, time: 0.671, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0541, loss_cls: 0.2368, acc: 91.7944, loss_bbox: 0.2786, loss_mask: 0.2654, loss: 0.8617 2024-05-29 13:57:45,488 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 13:16:17, time: 0.669, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0551, loss_cls: 0.2219, acc: 92.2451, loss_bbox: 0.2681, loss_mask: 0.2601, loss: 0.8314 2024-05-29 13:58:19,056 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 13:15:39, time: 0.671, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0534, loss_cls: 0.2255, acc: 92.0874, loss_bbox: 0.2682, loss_mask: 0.2670, loss: 0.8415 2024-05-29 13:58:53,030 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 13:15:02, time: 0.679, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0540, loss_cls: 0.2243, acc: 92.2065, loss_bbox: 0.2664, loss_mask: 0.2636, loss: 0.8356 2024-05-29 13:59:26,652 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 13:14:25, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0536, loss_cls: 0.2263, acc: 92.1667, loss_bbox: 0.2660, loss_mask: 0.2662, loss: 0.8404 2024-05-29 14:00:00,746 - mmdet - INFO - Epoch [3][4250/7330] 
lr: 1.000e-04, eta: 13:13:49, time: 0.682, data_time: 0.071, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0541, loss_cls: 0.2207, acc: 92.3372, loss_bbox: 0.2649, loss_mask: 0.2646, loss: 0.8307 2024-05-29 14:00:34,034 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 13:13:10, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0521, loss_cls: 0.2318, acc: 91.9734, loss_bbox: 0.2750, loss_mask: 0.2688, loss: 0.8539 2024-05-29 14:01:16,497 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 13:13:04, time: 0.849, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0511, loss_cls: 0.2144, acc: 92.4673, loss_bbox: 0.2595, loss_mask: 0.2617, loss: 0.8109 2024-05-29 14:01:49,624 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 13:12:25, time: 0.663, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0519, loss_cls: 0.2282, acc: 92.1514, loss_bbox: 0.2634, loss_mask: 0.2608, loss: 0.8300 2024-05-29 14:02:25,431 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 13:11:55, time: 0.716, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0531, loss_cls: 0.2282, acc: 92.1487, loss_bbox: 0.2720, loss_mask: 0.2658, loss: 0.8462 2024-05-29 14:02:58,221 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 13:11:14, time: 0.656, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0518, loss_cls: 0.2258, acc: 92.1736, loss_bbox: 0.2660, loss_mask: 0.2658, loss: 0.8364 2024-05-29 14:03:33,797 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 13:10:44, time: 0.711, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0517, loss_cls: 0.2280, acc: 92.1431, loss_bbox: 0.2742, loss_mask: 0.2610, loss: 0.8417 2024-05-29 14:04:10,373 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 13:10:16, time: 0.732, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0521, loss_cls: 0.2279, acc: 92.1682, loss_bbox: 0.2692, loss_mask: 0.2620, loss: 0.8360 2024-05-29 14:04:43,977 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 13:09:39, time: 0.672, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0522, loss_cls: 0.2316, acc: 91.9985, loss_bbox: 0.2723, loss_mask: 0.2637, loss: 0.8453 2024-05-29 14:05:19,604 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 13:09:08, time: 0.713, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0539, loss_cls: 0.2213, acc: 92.3459, loss_bbox: 0.2663, loss_mask: 0.2587, loss: 0.8251 2024-05-29 14:05:55,078 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 13:08:37, time: 0.709, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0500, loss_cls: 0.2173, acc: 92.5276, loss_bbox: 0.2552, loss_mask: 0.2587, loss: 0.8072 2024-05-29 14:06:28,715 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 13:07:59, time: 0.673, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0524, loss_cls: 0.2347, acc: 91.7271, loss_bbox: 0.2846, loss_mask: 0.2643, loss: 0.8618 2024-05-29 14:07:01,863 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 13:07:20, time: 0.663, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0542, loss_cls: 0.2240, acc: 92.1582, loss_bbox: 0.2715, loss_mask: 0.2585, loss: 0.8353 2024-05-29 14:07:35,146 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 13:06:41, time: 0.666, data_time: 0.047, memory: 11628, loss_rpn_cls: 
0.0267, loss_rpn_bbox: 0.0536, loss_cls: 0.2297, acc: 91.9353, loss_bbox: 0.2765, loss_mask: 0.2649, loss: 0.8513 2024-05-29 14:08:08,557 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 13:06:03, time: 0.668, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0531, loss_cls: 0.2233, acc: 92.2349, loss_bbox: 0.2630, loss_mask: 0.2630, loss: 0.8302 2024-05-29 14:08:42,081 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 13:05:25, time: 0.670, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0534, loss_cls: 0.2319, acc: 91.9058, loss_bbox: 0.2714, loss_mask: 0.2661, loss: 0.8523 2024-05-29 14:09:15,302 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 13:04:46, time: 0.664, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0523, loss_cls: 0.2186, acc: 92.3882, loss_bbox: 0.2566, loss_mask: 0.2593, loss: 0.8127 2024-05-29 14:09:48,589 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 13:04:08, time: 0.666, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0553, loss_cls: 0.2275, acc: 92.0833, loss_bbox: 0.2693, loss_mask: 0.2654, loss: 0.8458 2024-05-29 14:10:21,301 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 13:03:27, time: 0.654, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0514, loss_cls: 0.2207, acc: 92.4089, loss_bbox: 0.2607, loss_mask: 0.2641, loss: 0.8238 2024-05-29 14:10:54,785 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 13:02:49, time: 0.670, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0512, loss_cls: 0.2133, acc: 92.7178, loss_bbox: 0.2539, loss_mask: 0.2657, loss: 0.8117 2024-05-29 14:11:35,530 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 13:02:36, time: 0.815, data_time: 0.071, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0521, loss_cls: 0.2207, acc: 92.3635, loss_bbox: 0.2603, loss_mask: 0.2562, loss: 0.8128 2024-05-29 14:12:09,109 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 13:01:58, time: 0.672, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0542, loss_cls: 0.2319, acc: 91.9468, loss_bbox: 0.2807, loss_mask: 0.2654, loss: 0.8598 2024-05-29 14:12:45,220 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 13:01:29, time: 0.722, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0526, loss_cls: 0.2338, acc: 91.9785, loss_bbox: 0.2728, loss_mask: 0.2651, loss: 0.8528 2024-05-29 14:13:18,702 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 13:00:51, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0565, loss_cls: 0.2289, acc: 92.1216, loss_bbox: 0.2794, loss_mask: 0.2689, loss: 0.8611 2024-05-29 14:13:51,853 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 13:00:12, time: 0.663, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0531, loss_cls: 0.2266, acc: 92.1011, loss_bbox: 0.2691, loss_mask: 0.2584, loss: 0.8337 2024-05-29 14:14:29,712 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 12:59:49, time: 0.757, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0520, loss_cls: 0.2266, acc: 92.2588, loss_bbox: 0.2647, loss_mask: 0.2620, loss: 0.8308 2024-05-29 14:15:03,182 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 12:59:11, time: 0.669, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0529, loss_cls: 0.2215, acc: 92.3018, loss_bbox: 0.2648, loss_mask: 
0.2610, loss: 0.8285 2024-05-29 14:15:38,405 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 12:58:39, time: 0.705, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0520, loss_cls: 0.2153, acc: 92.4404, loss_bbox: 0.2567, loss_mask: 0.2592, loss: 0.8124 2024-05-29 14:16:14,472 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 12:58:10, time: 0.721, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0485, loss_cls: 0.2168, acc: 92.4336, loss_bbox: 0.2577, loss_mask: 0.2568, loss: 0.8047 2024-05-29 14:16:47,378 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 12:57:30, time: 0.658, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0542, loss_cls: 0.2251, acc: 92.2576, loss_bbox: 0.2662, loss_mask: 0.2695, loss: 0.8416 2024-05-29 14:17:19,961 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 12:56:49, time: 0.652, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0525, loss_cls: 0.2234, acc: 92.3015, loss_bbox: 0.2606, loss_mask: 0.2630, loss: 0.8260 2024-05-29 14:17:53,606 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 12:56:12, time: 0.673, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0514, loss_cls: 0.2182, acc: 92.4487, loss_bbox: 0.2580, loss_mask: 0.2593, loss: 0.8118 2024-05-29 14:18:26,633 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 12:55:32, time: 0.661, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0500, loss_cls: 0.2213, acc: 92.3728, loss_bbox: 0.2561, loss_mask: 0.2617, loss: 0.8179 2024-05-29 14:18:59,697 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 12:54:53, time: 0.661, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0514, loss_cls: 0.2270, acc: 92.2705, loss_bbox: 0.2643, loss_mask: 0.2576, loss: 0.8279 2024-05-29 14:19:33,306 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 12:54:16, time: 0.672, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0546, loss_cls: 0.2269, acc: 92.1052, loss_bbox: 0.2720, loss_mask: 0.2604, loss: 0.8428 2024-05-29 14:20:06,737 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 12:53:38, time: 0.669, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0517, loss_cls: 0.2246, acc: 92.1230, loss_bbox: 0.2696, loss_mask: 0.2622, loss: 0.8368 2024-05-29 14:20:40,513 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 12:53:01, time: 0.676, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0551, loss_cls: 0.2252, acc: 92.1362, loss_bbox: 0.2685, loss_mask: 0.2636, loss: 0.8391 2024-05-29 14:21:14,144 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 12:52:24, time: 0.673, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0496, loss_cls: 0.2148, acc: 92.5930, loss_bbox: 0.2498, loss_mask: 0.2532, loss: 0.7916 2024-05-29 14:21:54,858 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 12:52:09, time: 0.814, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0534, loss_cls: 0.2256, acc: 92.0610, loss_bbox: 0.2723, loss_mask: 0.2613, loss: 0.8388 2024-05-29 14:22:28,403 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 12:51:32, time: 0.671, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0499, loss_cls: 0.2089, acc: 92.7275, loss_bbox: 0.2540, loss_mask: 0.2580, loss: 0.7940 2024-05-29 14:23:03,860 - mmdet - INFO - Epoch [3][6250/7330] lr: 
1.000e-04, eta: 12:51:00, time: 0.709, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0504, loss_cls: 0.2301, acc: 91.9780, loss_bbox: 0.2697, loss_mask: 0.2608, loss: 0.8370 2024-05-29 14:23:37,829 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 12:50:24, time: 0.679, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0562, loss_cls: 0.2363, acc: 91.8416, loss_bbox: 0.2725, loss_mask: 0.2657, loss: 0.8588 2024-05-29 14:24:11,344 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 12:49:47, time: 0.670, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0555, loss_cls: 0.2316, acc: 92.0122, loss_bbox: 0.2698, loss_mask: 0.2638, loss: 0.8490 2024-05-29 14:24:47,421 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 12:49:17, time: 0.722, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0495, loss_cls: 0.2171, acc: 92.5020, loss_bbox: 0.2606, loss_mask: 0.2622, loss: 0.8158 2024-05-29 14:25:23,405 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 12:48:47, time: 0.720, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0556, loss_cls: 0.2352, acc: 91.7888, loss_bbox: 0.2785, loss_mask: 0.2702, loss: 0.8673 2024-05-29 14:25:58,744 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 12:48:16, time: 0.707, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0517, loss_cls: 0.2198, acc: 92.3921, loss_bbox: 0.2594, loss_mask: 0.2599, loss: 0.8178 2024-05-29 14:26:34,510 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 12:47:45, time: 0.715, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0520, loss_cls: 0.2174, acc: 92.4832, loss_bbox: 0.2587, loss_mask: 0.2611, loss: 0.8142 2024-05-29 14:27:07,772 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 12:47:07, time: 0.665, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0499, loss_cls: 0.2220, acc: 92.4004, loss_bbox: 0.2634, loss_mask: 0.2576, loss: 0.8200 2024-05-29 14:27:41,389 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 12:46:29, time: 0.672, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0522, loss_cls: 0.2271, acc: 92.1641, loss_bbox: 0.2635, loss_mask: 0.2628, loss: 0.8309 2024-05-29 14:28:15,132 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 12:45:52, time: 0.675, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0514, loss_cls: 0.2213, acc: 92.3245, loss_bbox: 0.2611, loss_mask: 0.2583, loss: 0.8190 2024-05-29 14:28:48,012 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 12:45:13, time: 0.658, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0512, loss_cls: 0.2134, acc: 92.4539, loss_bbox: 0.2600, loss_mask: 0.2549, loss: 0.8038 2024-05-29 14:29:21,067 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 12:44:34, time: 0.661, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0500, loss_cls: 0.2168, acc: 92.4795, loss_bbox: 0.2580, loss_mask: 0.2565, loss: 0.8056 2024-05-29 14:29:54,083 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 12:43:55, time: 0.660, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0509, loss_cls: 0.2192, acc: 92.3083, loss_bbox: 0.2606, loss_mask: 0.2644, loss: 0.8197 2024-05-29 14:30:27,401 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 12:43:17, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0286, 
loss_rpn_bbox: 0.0530, loss_cls: 0.2214, acc: 92.3718, loss_bbox: 0.2652, loss_mask: 0.2668, loss: 0.8349
2024-05-29 14:31:01,053 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 12:42:40, time: 0.673, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0542, loss_cls: 0.2225, acc: 92.1409, loss_bbox: 0.2699, loss_mask: 0.2687, loss: 0.8410
2024-05-29 14:31:34,275 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 12:42:01, time: 0.664, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0513, loss_cls: 0.2202, acc: 92.4702, loss_bbox: 0.2576, loss_mask: 0.2590, loss: 0.8135
2024-05-29 14:32:14,456 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 12:41:44, time: 0.804, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0533, loss_cls: 0.2222, acc: 92.3108, loss_bbox: 0.2634, loss_mask: 0.2558, loss: 0.8220
2024-05-29 14:32:48,146 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 12:41:07, time: 0.674, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0525, loss_cls: 0.2170, acc: 92.4292, loss_bbox: 0.2581, loss_mask: 0.2535, loss: 0.8067
2024-05-29 14:33:24,499 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 12:40:39, time: 0.727, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0536, loss_cls: 0.2273, acc: 92.0200, loss_bbox: 0.2705, loss_mask: 0.2589, loss: 0.8386
2024-05-29 14:33:58,439 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 12:40:02, time: 0.679, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0516, loss_cls: 0.2221, acc: 92.3525, loss_bbox: 0.2623, loss_mask: 0.2584, loss: 0.8203
2024-05-29 14:34:34,233 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 12:39:32, time: 0.716, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0512, loss_cls: 0.2209, acc: 92.3997, loss_bbox: 0.2636, loss_mask: 0.2615, loss: 0.8211
2024-05-29 14:35:07,575 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 12:38:54, time: 0.667, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0504, loss_cls: 0.2188, acc: 92.4062, loss_bbox: 0.2637, loss_mask: 0.2547, loss: 0.8117
2024-05-29 14:35:30,310 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-05-29 14:37:29,602 - mmdet - INFO - Evaluating bbox...
2024-05-29 14:37:59,625 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.384
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.621
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.421
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.220
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.429
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.522
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.521
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.521
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.521
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.333
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.573
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.671
2024-05-29 14:37:59,625 - mmdet - INFO - Evaluating segm...
2024-05-29 14:38:29,981 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.358
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.586
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.378
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.156
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.394
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.550
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.482
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.482
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.482
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.280
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.533
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.651
2024-05-29 14:38:30,577 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 14:38:30,579 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.3840, bbox_mAP_50: 0.6210, bbox_mAP_75: 0.4210, bbox_mAP_s: 0.2200, bbox_mAP_m: 0.4290, bbox_mAP_l: 0.5220, bbox_mAP_copypaste: 0.384 0.621 0.421 0.220 0.429 0.522, segm_mAP: 0.3580, segm_mAP_50: 0.5860, segm_mAP_75: 0.3780, segm_mAP_s: 0.1560, segm_mAP_m: 0.3940, segm_mAP_l: 0.5500, segm_mAP_copypaste: 0.358 0.586 0.378 0.156 0.394 0.550
2024-05-29 14:39:07,185 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 12:37:03, time: 0.732, data_time: 0.106, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0484, loss_cls: 0.2030, acc: 92.6079, loss_bbox: 0.2533, loss_mask: 0.2508, loss: 0.7773
2024-05-29 14:39:41,504 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 12:36:28, time: 0.686, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0530, loss_cls: 0.2191, acc: 92.1746, loss_bbox: 0.2704, loss_mask: 0.2612, loss: 0.8278
2024-05-29 14:40:15,070 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 12:35:51, time: 0.671, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0494, loss_cls: 0.2096, acc: 92.6150, loss_bbox: 0.2528, loss_mask: 0.2495, loss: 0.7830
2024-05-29 14:40:50,176 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 12:35:18, time: 0.702, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0510, loss_cls: 0.2112, acc: 92.5637, loss_bbox: 0.2584, loss_mask: 0.2520, loss: 0.7953
2024-05-29 14:41:26,027 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 12:34:48, time: 0.717, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0509, loss_cls: 0.2176, acc: 92.2515, loss_bbox: 0.2636, loss_mask: 0.2516, loss: 0.8058
2024-05-29 14:42:03,620 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 12:34:23, time: 0.752, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0509, loss_cls: 0.2155, acc: 92.3804, loss_bbox: 0.2559, loss_mask: 0.2536, loss: 0.7992
2024-05-29 14:42:36,883 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 12:33:45, time: 0.665, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0519, loss_cls: 0.2082, acc: 92.5408, loss_bbox: 0.2553, loss_mask: 0.2553, loss: 0.7956
2024-05-29 14:43:10,047 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 12:33:07, time: 0.663, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0480, loss_cls: 0.2150, acc: 92.2939, loss_bbox: 0.2655,
loss_mask: 0.2571, loss: 0.8088 2024-05-29 14:43:43,459 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 12:32:29, time: 0.668, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0527, loss_cls: 0.2197, acc: 92.1882, loss_bbox: 0.2650, loss_mask: 0.2583, loss: 0.8210 2024-05-29 14:44:19,471 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 12:31:59, time: 0.720, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0489, loss_cls: 0.2149, acc: 92.3796, loss_bbox: 0.2587, loss_mask: 0.2559, loss: 0.8020 2024-05-29 14:44:52,985 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 12:31:22, time: 0.670, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0481, loss_cls: 0.2071, acc: 92.8250, loss_bbox: 0.2451, loss_mask: 0.2468, loss: 0.7707 2024-05-29 14:45:26,390 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 12:30:45, time: 0.668, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0493, loss_cls: 0.2022, acc: 92.8572, loss_bbox: 0.2526, loss_mask: 0.2506, loss: 0.7768 2024-05-29 14:46:02,129 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 12:30:14, time: 0.715, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0502, loss_cls: 0.2115, acc: 92.4441, loss_bbox: 0.2569, loss_mask: 0.2548, loss: 0.7968 2024-05-29 14:46:35,424 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 12:29:36, time: 0.666, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0512, loss_cls: 0.2049, acc: 92.7722, loss_bbox: 0.2501, loss_mask: 0.2546, loss: 0.7844 2024-05-29 14:47:08,630 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 12:28:58, time: 0.664, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0527, loss_cls: 0.2143, acc: 92.4077, loss_bbox: 0.2596, loss_mask: 0.2570, loss: 0.8087 2024-05-29 14:47:41,805 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 12:28:20, time: 0.664, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0515, loss_cls: 0.2114, acc: 92.4834, loss_bbox: 0.2620, loss_mask: 0.2635, loss: 0.8121 2024-05-29 14:48:14,916 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 12:27:42, time: 0.662, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0521, loss_cls: 0.2051, acc: 92.5842, loss_bbox: 0.2561, loss_mask: 0.2546, loss: 0.7919 2024-05-29 14:48:48,667 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 12:27:05, time: 0.675, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0503, loss_cls: 0.2136, acc: 92.4202, loss_bbox: 0.2614, loss_mask: 0.2569, loss: 0.8048 2024-05-29 14:49:25,058 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 12:26:36, time: 0.728, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0470, loss_cls: 0.2082, acc: 92.4658, loss_bbox: 0.2602, loss_mask: 0.2477, loss: 0.7849 2024-05-29 14:50:01,089 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 12:26:06, time: 0.721, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0497, loss_cls: 0.2195, acc: 92.2690, loss_bbox: 0.2628, loss_mask: 0.2559, loss: 0.8110 2024-05-29 14:50:33,943 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 12:25:27, time: 0.657, data_time: 0.036, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0484, loss_cls: 0.2097, acc: 92.6213, loss_bbox: 0.2533, loss_mask: 0.2576, loss: 0.7947 2024-05-29 14:51:10,453 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 12:24:59, time: 0.730, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0527, loss_cls: 0.2139, acc: 92.4346, loss_bbox: 0.2601, loss_mask: 0.2543, loss: 0.8056 2024-05-29 14:51:46,043 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 12:24:27, time: 0.712, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0538, loss_cls: 0.2169, acc: 92.3547, loss_bbox: 0.2631, loss_mask: 0.2541, loss: 0.8129 2024-05-29 14:52:24,114 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 12:24:03, time: 0.761, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0524, loss_cls: 0.2166, acc: 92.3123, loss_bbox: 0.2700, loss_mask: 0.2619, loss: 0.8236 2024-05-29 14:52:57,772 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 12:23:26, time: 0.673, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0500, loss_cls: 0.2115, acc: 92.4639, loss_bbox: 0.2589, loss_mask: 0.2504, loss: 0.7954 2024-05-29 14:53:31,078 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 12:22:49, time: 0.666, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0493, loss_cls: 0.2188, acc: 92.1804, loss_bbox: 0.2676, loss_mask: 0.2590, loss: 0.8178 2024-05-29 14:54:05,192 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 12:22:13, time: 0.682, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0520, loss_cls: 0.2207, acc: 92.1973, loss_bbox: 0.2680, loss_mask: 0.2540, loss: 0.8189 2024-05-29 14:54:40,230 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 12:21:40, time: 0.701, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0480, loss_cls: 0.2100, acc: 92.5557, loss_bbox: 0.2577, loss_mask: 0.2516, loss: 0.7905 2024-05-29 14:55:14,203 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 12:21:05, time: 0.679, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0474, loss_cls: 0.2067, acc: 92.6326, loss_bbox: 0.2542, loss_mask: 0.2512, loss: 0.7830 2024-05-29 14:55:47,377 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 12:20:27, time: 0.664, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0516, loss_cls: 0.2101, acc: 92.6050, loss_bbox: 0.2574, loss_mask: 0.2470, loss: 0.7911 2024-05-29 14:56:22,419 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 12:19:54, time: 0.701, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0484, loss_cls: 0.2092, acc: 92.5818, loss_bbox: 0.2550, loss_mask: 0.2489, loss: 0.7857 2024-05-29 14:56:55,403 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 12:19:15, time: 0.660, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0508, loss_cls: 0.2072, acc: 92.5078, loss_bbox: 0.2572, loss_mask: 0.2577, loss: 0.7951 2024-05-29 14:57:29,455 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 12:18:40, time: 0.681, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0496, loss_cls: 0.2017, acc: 92.8972, loss_bbox: 0.2473, loss_mask: 0.2513, loss: 0.7724 2024-05-29 14:58:02,544 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 12:18:02, time: 0.662, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0528, loss_cls: 0.2177, acc: 92.3135, loss_bbox: 0.2659, loss_mask: 0.2593, loss: 0.8200 2024-05-29 14:58:36,561 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 12:17:26, time: 0.680, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0252, 
loss_rpn_bbox: 0.0529, loss_cls: 0.2151, acc: 92.2988, loss_bbox: 0.2578, loss_mask: 0.2539, loss: 0.8049 2024-05-29 14:59:10,818 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 12:16:51, time: 0.685, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0521, loss_cls: 0.2189, acc: 92.2019, loss_bbox: 0.2718, loss_mask: 0.2611, loss: 0.8293 2024-05-29 14:59:46,280 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 12:16:19, time: 0.709, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0463, loss_cls: 0.1958, acc: 93.0454, loss_bbox: 0.2383, loss_mask: 0.2445, loss: 0.7486 2024-05-29 15:00:20,987 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 12:15:46, time: 0.694, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0495, loss_cls: 0.2020, acc: 92.9067, loss_bbox: 0.2446, loss_mask: 0.2443, loss: 0.7625 2024-05-29 15:00:54,185 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 12:15:08, time: 0.664, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0490, loss_cls: 0.2142, acc: 92.5535, loss_bbox: 0.2593, loss_mask: 0.2540, loss: 0.7976 2024-05-29 15:01:30,664 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 12:14:39, time: 0.730, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0501, loss_cls: 0.2146, acc: 92.3796, loss_bbox: 0.2605, loss_mask: 0.2506, loss: 0.8007 2024-05-29 15:02:06,667 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 12:14:08, time: 0.720, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0525, loss_cls: 0.2096, acc: 92.5837, loss_bbox: 0.2594, loss_mask: 0.2533, loss: 0.8003 2024-05-29 15:02:47,079 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 12:13:50, time: 0.808, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0502, loss_cls: 0.2105, acc: 92.4836, loss_bbox: 0.2603, loss_mask: 0.2504, loss: 0.7942 2024-05-29 15:03:20,122 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 12:13:11, time: 0.661, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0515, loss_cls: 0.2091, acc: 92.5874, loss_bbox: 0.2542, loss_mask: 0.2543, loss: 0.7924 2024-05-29 15:03:53,875 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 12:12:35, time: 0.675, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0494, loss_cls: 0.2112, acc: 92.5547, loss_bbox: 0.2509, loss_mask: 0.2456, loss: 0.7821 2024-05-29 15:04:27,542 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 12:11:59, time: 0.673, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0502, loss_cls: 0.2064, acc: 92.8298, loss_bbox: 0.2454, loss_mask: 0.2482, loss: 0.7729 2024-05-29 15:05:03,325 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 12:11:28, time: 0.716, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0542, loss_cls: 0.2132, acc: 92.4243, loss_bbox: 0.2655, loss_mask: 0.2602, loss: 0.8176 2024-05-29 15:05:36,787 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 12:10:50, time: 0.669, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0545, loss_cls: 0.2151, acc: 92.3760, loss_bbox: 0.2610, loss_mask: 0.2573, loss: 0.8138 2024-05-29 15:06:10,168 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 12:10:13, time: 0.668, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0482, loss_cls: 0.2042, acc: 92.8252, loss_bbox: 0.2485, loss_mask: 0.2501, 
loss: 0.7749 2024-05-29 15:06:46,392 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 12:09:43, time: 0.724, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0512, loss_cls: 0.2133, acc: 92.4573, loss_bbox: 0.2616, loss_mask: 0.2623, loss: 0.8120 2024-05-29 15:07:20,015 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 12:09:07, time: 0.672, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0514, loss_cls: 0.2235, acc: 92.1343, loss_bbox: 0.2639, loss_mask: 0.2591, loss: 0.8214 2024-05-29 15:07:53,391 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 12:08:29, time: 0.668, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0501, loss_cls: 0.2137, acc: 92.4668, loss_bbox: 0.2618, loss_mask: 0.2616, loss: 0.8098 2024-05-29 15:08:26,517 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 12:07:51, time: 0.662, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0463, loss_cls: 0.2075, acc: 92.7251, loss_bbox: 0.2454, loss_mask: 0.2522, loss: 0.7733 2024-05-29 15:09:00,240 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 12:07:15, time: 0.674, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0481, loss_cls: 0.2011, acc: 92.9221, loss_bbox: 0.2423, loss_mask: 0.2447, loss: 0.7587 2024-05-29 15:09:33,280 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 12:06:37, time: 0.661, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0483, loss_cls: 0.2057, acc: 92.7271, loss_bbox: 0.2541, loss_mask: 0.2561, loss: 0.7894 2024-05-29 15:10:08,780 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 12:06:05, time: 0.710, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0521, loss_cls: 0.2125, acc: 92.4209, loss_bbox: 0.2622, loss_mask: 0.2539, loss: 0.8046 2024-05-29 15:10:44,246 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 12:05:33, time: 0.709, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0540, loss_cls: 0.2232, acc: 92.1643, loss_bbox: 0.2687, loss_mask: 0.2633, loss: 0.8340 2024-05-29 15:11:17,647 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 12:04:56, time: 0.668, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0516, loss_cls: 0.2122, acc: 92.5247, loss_bbox: 0.2523, loss_mask: 0.2516, loss: 0.7919 2024-05-29 15:11:53,716 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 12:04:26, time: 0.721, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0504, loss_cls: 0.2107, acc: 92.4905, loss_bbox: 0.2583, loss_mask: 0.2504, loss: 0.7933 2024-05-29 15:12:29,122 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 12:03:54, time: 0.708, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0505, loss_cls: 0.2122, acc: 92.5325, loss_bbox: 0.2561, loss_mask: 0.2539, loss: 0.7966 2024-05-29 15:13:07,788 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 12:03:30, time: 0.773, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0488, loss_cls: 0.2096, acc: 92.6562, loss_bbox: 0.2522, loss_mask: 0.2617, loss: 0.7966 2024-05-29 15:13:41,109 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 12:02:53, time: 0.666, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0493, loss_cls: 0.2056, acc: 92.8040, loss_bbox: 0.2465, loss_mask: 0.2497, loss: 0.7737 2024-05-29 15:14:13,981 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
12:02:14, time: 0.657, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0474, loss_cls: 0.2111, acc: 92.5303, loss_bbox: 0.2539, loss_mask: 0.2554, loss: 0.7880 2024-05-29 15:14:47,055 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 12:01:36, time: 0.661, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0468, loss_cls: 0.2026, acc: 92.8086, loss_bbox: 0.2489, loss_mask: 0.2486, loss: 0.7693 2024-05-29 15:15:23,348 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 12:01:06, time: 0.726, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0510, loss_cls: 0.2099, acc: 92.5876, loss_bbox: 0.2542, loss_mask: 0.2588, loss: 0.7984 2024-05-29 15:15:57,409 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 12:00:31, time: 0.681, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0526, loss_cls: 0.2164, acc: 92.4802, loss_bbox: 0.2605, loss_mask: 0.2523, loss: 0.8076 2024-05-29 15:16:31,087 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 11:59:54, time: 0.674, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0524, loss_cls: 0.2102, acc: 92.5627, loss_bbox: 0.2566, loss_mask: 0.2588, loss: 0.8042 2024-05-29 15:17:06,370 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 11:59:22, time: 0.706, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0529, loss_cls: 0.2120, acc: 92.4875, loss_bbox: 0.2614, loss_mask: 0.2609, loss: 0.8113 2024-05-29 15:17:39,813 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 11:58:45, time: 0.669, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0474, loss_cls: 0.2067, acc: 92.7607, loss_bbox: 0.2535, loss_mask: 0.2498, loss: 0.7812 2024-05-29 15:18:13,669 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 11:58:09, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0520, loss_cls: 0.2132, acc: 92.5718, loss_bbox: 0.2610, loss_mask: 0.2622, loss: 0.8136 2024-05-29 15:18:47,276 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 11:57:32, time: 0.672, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0525, loss_cls: 0.2117, acc: 92.4099, loss_bbox: 0.2576, loss_mask: 0.2548, loss: 0.8010 2024-05-29 15:19:20,556 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 11:56:55, time: 0.666, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0508, loss_cls: 0.2131, acc: 92.5000, loss_bbox: 0.2557, loss_mask: 0.2531, loss: 0.7958 2024-05-29 15:19:54,842 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 11:56:20, time: 0.686, data_time: 0.069, memory: 11628, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0528, loss_cls: 0.2113, acc: 92.5347, loss_bbox: 0.2592, loss_mask: 0.2578, loss: 0.8063 2024-05-29 15:20:30,013 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 11:55:48, time: 0.703, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0474, loss_cls: 0.2011, acc: 92.8843, loss_bbox: 0.2460, loss_mask: 0.2505, loss: 0.7670 2024-05-29 15:21:06,431 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 11:55:18, time: 0.728, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0487, loss_cls: 0.2070, acc: 92.6890, loss_bbox: 0.2501, loss_mask: 0.2520, loss: 0.7819 2024-05-29 15:21:39,672 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 11:54:40, time: 0.665, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 
0.0491, loss_cls: 0.2151, acc: 92.4456, loss_bbox: 0.2527, loss_mask: 0.2472, loss: 0.7877 2024-05-29 15:22:15,559 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 11:54:09, time: 0.718, data_time: 0.037, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0475, loss_cls: 0.2044, acc: 92.7473, loss_bbox: 0.2486, loss_mask: 0.2485, loss: 0.7708 2024-05-29 15:22:51,180 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 11:53:38, time: 0.712, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0510, loss_cls: 0.2159, acc: 92.2925, loss_bbox: 0.2628, loss_mask: 0.2546, loss: 0.8088 2024-05-29 15:23:29,123 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 11:53:12, time: 0.759, data_time: 0.037, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0490, loss_cls: 0.1996, acc: 93.0486, loss_bbox: 0.2422, loss_mask: 0.2495, loss: 0.7615 2024-05-29 15:24:02,457 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 11:52:34, time: 0.667, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0499, loss_cls: 0.2188, acc: 92.2673, loss_bbox: 0.2588, loss_mask: 0.2575, loss: 0.8123 2024-05-29 15:24:36,351 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 11:51:59, time: 0.678, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0487, loss_cls: 0.2175, acc: 92.2930, loss_bbox: 0.2552, loss_mask: 0.2573, loss: 0.8027 2024-05-29 15:25:09,895 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 11:51:22, time: 0.671, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0513, loss_cls: 0.2082, acc: 92.7505, loss_bbox: 0.2464, loss_mask: 0.2509, loss: 0.7813 2024-05-29 15:25:45,784 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 11:50:51, time: 0.718, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0513, loss_cls: 0.2168, acc: 92.2351, loss_bbox: 0.2600, loss_mask: 0.2562, loss: 0.8086 2024-05-29 15:26:19,867 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 11:50:15, time: 0.682, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0504, loss_cls: 0.2121, acc: 92.5352, loss_bbox: 0.2544, loss_mask: 0.2568, loss: 0.7978 2024-05-29 15:26:53,053 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 11:49:38, time: 0.664, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0478, loss_cls: 0.2075, acc: 92.6099, loss_bbox: 0.2540, loss_mask: 0.2543, loss: 0.7872 2024-05-29 15:27:29,409 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 11:49:08, time: 0.727, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0507, loss_cls: 0.2100, acc: 92.5339, loss_bbox: 0.2542, loss_mask: 0.2582, loss: 0.7987 2024-05-29 15:28:03,044 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 11:48:31, time: 0.673, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0529, loss_cls: 0.2189, acc: 92.2620, loss_bbox: 0.2654, loss_mask: 0.2551, loss: 0.8172 2024-05-29 15:28:36,025 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 11:47:54, time: 0.660, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0512, loss_cls: 0.2187, acc: 92.3369, loss_bbox: 0.2638, loss_mask: 0.2562, loss: 0.8148 2024-05-29 15:29:10,023 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 11:47:18, time: 0.680, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0517, loss_cls: 0.2248, acc: 91.9819, loss_bbox: 0.2698, loss_mask: 0.2628, loss: 0.8335 
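Each training record above follows the same key: value layout: Epoch [epoch][iter/7330], then lr, eta, time, data_time, memory, the five loss terms, the box-head classification accuracy acc, and the total loss. The total is simply the sum of the five loss components; for the Epoch [4][1200/7330] record above, 0.0227 + 0.0524 + 0.2166 + 0.2700 + 0.2619 = 0.8236, which matches the logged loss. Below is a minimal parsing sketch for pulling these records out of the raw log file; it is not part of the training code, the log filename is a placeholder, and it assumes the raw file keeps one record per line.

```python
# Sketch only (not part of the training run): parse per-iteration records from the raw
# mmdet text log and check that the reported `loss` equals the sum of the `loss_*` terms.
# Assumptions: LOG_PATH is a placeholder filename; the raw log has one record per line.
import re

LOG_PATH = "mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.log"  # assumed filename

HEADER = re.compile(r"Epoch \[(\d+)\]\[(\d+)/(\d+)\]")
KV = re.compile(r"(\w+): ([0-9][0-9.e+-]*)")  # matches lr, time, memory, acc, loss_*, loss

def parse_records(path):
    records = []
    with open(path) as f:
        for line in f:
            m = HEADER.search(line)
            if m is None or "loss:" not in line:
                continue  # skip checkpoint/eval lines and the Epoch(val) summaries
            epoch, step, per_epoch = map(int, m.groups())
            # The key/value pairs start after the "[iter/7330]" bracket.
            fields = {k: float(v) for k, v in KV.findall(line.split("]", 2)[-1])}
            fields.pop("eta", None)  # eta is H:MM:SS text; drop the partial numeric capture
            fields.update(epoch=epoch, step=step, iters_per_epoch=per_epoch)
            records.append(fields)
    return records

if __name__ == "__main__":
    recs = parse_records(LOG_PATH)
    for r in recs:
        # Total loss = loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask,
        # up to the 4-decimal rounding used in the log.
        parts = sum(v for k, v in r.items() if k.startswith("loss_"))
        assert abs(parts - r["loss"]) < 1e-3, (r["epoch"], r["step"])
    mean_time = sum(r["time"] for r in recs) / max(len(recs), 1)
    print(f"{len(recs)} records, mean iteration time {mean_time:.3f}s")
```

The logged eta is consistent with remaining iterations times the smoothed iteration time: assuming the conventional 12-epoch "1x" schedule implied by the config name, the Epoch [4][1200/7330] record has (12 - 4) * 7330 + (7330 - 1200) = 64770 iterations left, and 64770 iterations at roughly 0.69 s each is about 12.4 hours, in line with the logged eta of 12:24:03.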
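The per-epoch Epoch(val) records condense the COCO evaluation tables into flat key: value summaries; in this excerpt the box mAP rises from 0.384 after epoch 3 to 0.403 after epoch 4, and the mask mAP from 0.358 to 0.371 (the epoch-4 summary appears further below). A companion sketch, using the same assumed log path as above, extracts those summaries and prints the epoch-over-epoch change.

```python
# Sketch only: extract the "Epoch(val)" summary records from the same assumed raw log
# and report how bbox/segm mAP change between consecutive validation runs.
import re

LOG_PATH = "mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.log"  # assumed filename

VAL = re.compile(r"Epoch\(val\) \[(\d+)\]\[\d+\]\s+(.*)")
# Only "key: number," pairs are kept; the space-separated *_copypaste duplicates
# are not followed by a comma or end of line, so they are skipped automatically.
KV = re.compile(r"(\w+): ([0-9.]+)(?=,|$)")

def parse_val_summaries(path):
    summaries = {}
    with open(path) as f:
        for line in f:
            m = VAL.search(line)
            if m:
                summaries[int(m.group(1))] = {k: float(v) for k, v in KV.findall(m.group(2))}
    return summaries

if __name__ == "__main__":
    vals = parse_val_summaries(LOG_PATH)
    epochs = sorted(vals)
    for prev, cur in zip(epochs, epochs[1:]):
        d_box = vals[cur]["bbox_mAP"] - vals[prev]["bbox_mAP"]
        d_seg = vals[cur]["segm_mAP"] - vals[prev]["segm_mAP"]
        print(f"epoch {prev} -> {cur}: bbox_mAP {d_box:+.3f}, segm_mAP {d_seg:+.3f}")
```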
2024-05-29 15:29:43,357 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 11:46:41, time: 0.667, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0511, loss_cls: 0.2162, acc: 92.3516, loss_bbox: 0.2585, loss_mask: 0.2605, loss: 0.8102 2024-05-29 15:30:17,202 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 11:46:05, time: 0.677, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0519, loss_cls: 0.2103, acc: 92.5110, loss_bbox: 0.2571, loss_mask: 0.2577, loss: 0.8005 2024-05-29 15:30:53,194 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 11:45:34, time: 0.720, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0515, loss_cls: 0.2129, acc: 92.5342, loss_bbox: 0.2536, loss_mask: 0.2567, loss: 0.7990 2024-05-29 15:31:29,547 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 11:45:04, time: 0.727, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0522, loss_cls: 0.2190, acc: 92.3752, loss_bbox: 0.2636, loss_mask: 0.2515, loss: 0.8089 2024-05-29 15:32:03,501 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 11:44:28, time: 0.679, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0526, loss_cls: 0.2168, acc: 92.4260, loss_bbox: 0.2596, loss_mask: 0.2571, loss: 0.8125 2024-05-29 15:32:39,771 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 11:43:58, time: 0.726, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0522, loss_cls: 0.2216, acc: 92.1799, loss_bbox: 0.2642, loss_mask: 0.2561, loss: 0.8176 2024-05-29 15:33:16,186 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 11:43:28, time: 0.728, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0514, loss_cls: 0.2199, acc: 92.2495, loss_bbox: 0.2673, loss_mask: 0.2561, loss: 0.8199 2024-05-29 15:33:54,828 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 11:43:03, time: 0.773, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0509, loss_cls: 0.2106, acc: 92.5327, loss_bbox: 0.2551, loss_mask: 0.2531, loss: 0.7941 2024-05-29 15:34:27,506 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 11:42:24, time: 0.653, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0515, loss_cls: 0.2070, acc: 92.7639, loss_bbox: 0.2493, loss_mask: 0.2511, loss: 0.7829 2024-05-29 15:35:01,164 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 11:41:48, time: 0.674, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0506, loss_cls: 0.2068, acc: 92.6306, loss_bbox: 0.2508, loss_mask: 0.2527, loss: 0.7833 2024-05-29 15:35:34,433 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 11:41:11, time: 0.665, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0502, loss_cls: 0.2099, acc: 92.4614, loss_bbox: 0.2544, loss_mask: 0.2527, loss: 0.7909 2024-05-29 15:36:10,467 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 11:40:40, time: 0.721, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0512, loss_cls: 0.2142, acc: 92.4360, loss_bbox: 0.2574, loss_mask: 0.2529, loss: 0.7987 2024-05-29 15:36:43,749 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 11:40:03, time: 0.666, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0498, loss_cls: 0.2116, acc: 92.5413, loss_bbox: 0.2542, loss_mask: 0.2487, loss: 0.7876 2024-05-29 15:37:16,608 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 11:39:25, 
time: 0.657, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0482, loss_cls: 0.2042, acc: 92.6335, loss_bbox: 0.2563, loss_mask: 0.2539, loss: 0.7849 2024-05-29 15:37:52,063 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 11:38:52, time: 0.709, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0507, loss_cls: 0.2098, acc: 92.5981, loss_bbox: 0.2505, loss_mask: 0.2522, loss: 0.7874 2024-05-29 15:38:25,539 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 11:38:16, time: 0.669, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0491, loss_cls: 0.2079, acc: 92.5789, loss_bbox: 0.2572, loss_mask: 0.2508, loss: 0.7869 2024-05-29 15:38:59,435 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 11:37:40, time: 0.678, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0510, loss_cls: 0.2164, acc: 92.4619, loss_bbox: 0.2544, loss_mask: 0.2564, loss: 0.8023 2024-05-29 15:39:33,093 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 11:37:04, time: 0.673, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0517, loss_cls: 0.2043, acc: 92.7351, loss_bbox: 0.2550, loss_mask: 0.2526, loss: 0.7876 2024-05-29 15:40:06,650 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 11:36:27, time: 0.671, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0530, loss_cls: 0.2143, acc: 92.4089, loss_bbox: 0.2572, loss_mask: 0.2568, loss: 0.8051 2024-05-29 15:40:40,038 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 11:35:50, time: 0.668, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0501, loss_cls: 0.2130, acc: 92.4675, loss_bbox: 0.2504, loss_mask: 0.2530, loss: 0.7907 2024-05-29 15:41:15,244 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 11:35:18, time: 0.704, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0477, loss_cls: 0.2073, acc: 92.6313, loss_bbox: 0.2547, loss_mask: 0.2504, loss: 0.7823 2024-05-29 15:41:51,188 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 11:34:46, time: 0.719, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0482, loss_cls: 0.2085, acc: 92.6567, loss_bbox: 0.2505, loss_mask: 0.2494, loss: 0.7767 2024-05-29 15:42:24,489 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 11:34:09, time: 0.666, data_time: 0.036, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0497, loss_cls: 0.2129, acc: 92.3911, loss_bbox: 0.2642, loss_mask: 0.2546, loss: 0.8040 2024-05-29 15:43:00,249 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 11:33:38, time: 0.715, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0475, loss_cls: 0.2035, acc: 92.7561, loss_bbox: 0.2481, loss_mask: 0.2525, loss: 0.7739 2024-05-29 15:43:36,111 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 11:33:06, time: 0.717, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0473, loss_cls: 0.2000, acc: 92.9006, loss_bbox: 0.2440, loss_mask: 0.2505, loss: 0.7626 2024-05-29 15:44:14,612 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 11:32:40, time: 0.770, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0494, loss_cls: 0.2146, acc: 92.3181, loss_bbox: 0.2587, loss_mask: 0.2598, loss: 0.8068 2024-05-29 15:44:47,549 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 11:32:03, time: 0.659, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0479, 
loss_cls: 0.2081, acc: 92.6982, loss_bbox: 0.2505, loss_mask: 0.2533, loss: 0.7820 2024-05-29 15:45:20,936 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 11:31:26, time: 0.668, data_time: 0.069, memory: 11628, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0507, loss_cls: 0.2111, acc: 92.5771, loss_bbox: 0.2562, loss_mask: 0.2507, loss: 0.7920 2024-05-29 15:45:54,349 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 11:30:49, time: 0.668, data_time: 0.035, memory: 11628, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0487, loss_cls: 0.2110, acc: 92.5837, loss_bbox: 0.2551, loss_mask: 0.2509, loss: 0.7890 2024-05-29 15:46:29,709 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 11:30:16, time: 0.707, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0494, loss_cls: 0.2170, acc: 92.3032, loss_bbox: 0.2614, loss_mask: 0.2544, loss: 0.8053 2024-05-29 15:47:03,349 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 11:29:40, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0495, loss_cls: 0.2118, acc: 92.6426, loss_bbox: 0.2547, loss_mask: 0.2502, loss: 0.7911 2024-05-29 15:47:36,844 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 11:29:04, time: 0.670, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0478, loss_cls: 0.2027, acc: 92.8125, loss_bbox: 0.2431, loss_mask: 0.2530, loss: 0.7730 2024-05-29 15:48:12,375 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 11:28:31, time: 0.711, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0532, loss_cls: 0.2157, acc: 92.3804, loss_bbox: 0.2590, loss_mask: 0.2532, loss: 0.8055 2024-05-29 15:48:45,712 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 11:27:55, time: 0.667, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0498, loss_cls: 0.2115, acc: 92.6826, loss_bbox: 0.2529, loss_mask: 0.2558, loss: 0.7927 2024-05-29 15:49:19,567 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 11:27:19, time: 0.677, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0525, loss_cls: 0.2215, acc: 92.0408, loss_bbox: 0.2646, loss_mask: 0.2633, loss: 0.8290 2024-05-29 15:49:53,572 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 11:26:43, time: 0.680, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0495, loss_cls: 0.2096, acc: 92.5706, loss_bbox: 0.2560, loss_mask: 0.2493, loss: 0.7896 2024-05-29 15:50:26,690 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 11:26:06, time: 0.662, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0484, loss_cls: 0.2044, acc: 92.7881, loss_bbox: 0.2441, loss_mask: 0.2505, loss: 0.7703 2024-05-29 15:51:00,196 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 11:25:30, time: 0.670, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0519, loss_cls: 0.2121, acc: 92.4707, loss_bbox: 0.2527, loss_mask: 0.2556, loss: 0.7985 2024-05-29 15:51:35,898 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 11:24:58, time: 0.714, data_time: 0.036, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0464, loss_cls: 0.1986, acc: 93.0190, loss_bbox: 0.2405, loss_mask: 0.2435, loss: 0.7521 2024-05-29 15:52:11,812 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 11:24:26, time: 0.718, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0469, loss_cls: 0.2091, acc: 92.7288, loss_bbox: 0.2505, loss_mask: 0.2578, loss: 0.7872 2024-05-29 
15:52:45,209 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 11:23:50, time: 0.668, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0468, loss_cls: 0.2103, acc: 92.5012, loss_bbox: 0.2535, loss_mask: 0.2533, loss: 0.7883 2024-05-29 15:53:21,930 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 11:23:20, time: 0.734, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0493, loss_cls: 0.2071, acc: 92.6450, loss_bbox: 0.2527, loss_mask: 0.2475, loss: 0.7800 2024-05-29 15:54:00,118 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 11:22:53, time: 0.764, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0486, loss_cls: 0.2079, acc: 92.6865, loss_bbox: 0.2472, loss_mask: 0.2464, loss: 0.7726 2024-05-29 15:54:38,761 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 11:22:27, time: 0.773, data_time: 0.037, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0489, loss_cls: 0.2100, acc: 92.5759, loss_bbox: 0.2528, loss_mask: 0.2562, loss: 0.7913 2024-05-29 15:55:12,236 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 11:21:51, time: 0.670, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0485, loss_cls: 0.2116, acc: 92.5034, loss_bbox: 0.2553, loss_mask: 0.2550, loss: 0.7942 2024-05-29 15:55:45,596 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 11:21:14, time: 0.667, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0519, loss_cls: 0.2119, acc: 92.5515, loss_bbox: 0.2552, loss_mask: 0.2558, loss: 0.7997 2024-05-29 15:56:19,250 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 11:20:38, time: 0.673, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0480, loss_cls: 0.2163, acc: 92.3083, loss_bbox: 0.2568, loss_mask: 0.2551, loss: 0.8004 2024-05-29 15:56:55,053 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 11:20:06, time: 0.716, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0492, loss_cls: 0.2051, acc: 92.9031, loss_bbox: 0.2474, loss_mask: 0.2564, loss: 0.7837 2024-05-29 15:57:28,860 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 11:19:30, time: 0.676, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0499, loss_cls: 0.1982, acc: 92.9089, loss_bbox: 0.2428, loss_mask: 0.2491, loss: 0.7619 2024-05-29 15:58:02,149 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 11:18:53, time: 0.666, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0465, loss_cls: 0.1984, acc: 92.9907, loss_bbox: 0.2444, loss_mask: 0.2485, loss: 0.7602 2024-05-29 15:58:37,709 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 11:18:21, time: 0.711, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0480, loss_cls: 0.1990, acc: 92.9568, loss_bbox: 0.2421, loss_mask: 0.2460, loss: 0.7570 2024-05-29 15:59:11,201 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 11:17:44, time: 0.670, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0482, loss_cls: 0.1981, acc: 92.9250, loss_bbox: 0.2418, loss_mask: 0.2420, loss: 0.7527 2024-05-29 15:59:44,808 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 11:17:08, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0479, loss_cls: 0.2112, acc: 92.5146, loss_bbox: 0.2589, loss_mask: 0.2513, loss: 0.7924 2024-05-29 16:00:19,558 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 11:16:34, time: 0.695, 
data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0532, loss_cls: 0.2139, acc: 92.4746, loss_bbox: 0.2506, loss_mask: 0.2531, loss: 0.7943
2024-05-29 16:00:53,128 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 11:15:58, time: 0.671, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0495, loss_cls: 0.2133, acc: 92.4607, loss_bbox: 0.2538, loss_mask: 0.2489, loss: 0.7883
2024-05-29 16:01:26,290 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 11:15:21, time: 0.663, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0470, loss_cls: 0.2060, acc: 92.7881, loss_bbox: 0.2444, loss_mask: 0.2497, loss: 0.7692
2024-05-29 16:02:02,041 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 11:14:49, time: 0.715, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0507, loss_cls: 0.2127, acc: 92.5449, loss_bbox: 0.2550, loss_mask: 0.2549, loss: 0.7959
2024-05-29 16:02:38,068 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 11:14:17, time: 0.721, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0450, loss_cls: 0.2088, acc: 92.7046, loss_bbox: 0.2455, loss_mask: 0.2460, loss: 0.7675
2024-05-29 16:02:58,766 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-05-29 16:04:52,087 - mmdet - INFO - Evaluating bbox...
2024-05-29 16:05:16,405 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.403
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.633
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.442
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.235
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.449
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.532
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.532
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.532
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.342
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.578
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.689
2024-05-29 16:05:16,406 - mmdet - INFO - Evaluating segm...
2024-05-29 16:05:42,498 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.371
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.601
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.390
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.166
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.406
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.570
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.284
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.667
2024-05-29 16:05:42,884 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 16:05:42,885 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4030, bbox_mAP_50: 0.6330, bbox_mAP_75: 0.4420, bbox_mAP_s: 0.2350, bbox_mAP_m: 0.4490, bbox_mAP_l: 0.5480, bbox_mAP_copypaste: 0.403 0.633 0.442 0.235 0.449 0.548, segm_mAP: 0.3710, segm_mAP_50: 0.6010, segm_mAP_75: 0.3900, segm_mAP_s: 0.1660, segm_mAP_m: 0.4060, segm_mAP_l: 0.5700, segm_mAP_copypaste: 0.371 0.601 0.390 0.166 0.406 0.570
2024-05-29 16:06:31,894 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 11:13:10, time: 0.980, data_time: 0.122, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0493, loss_cls: 0.1975, acc: 92.7415, loss_bbox: 0.2488, loss_mask: 0.2455, loss: 0.7613
2024-05-29 16:07:10,018 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 11:12:43, time: 0.762, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0472, loss_cls: 0.1953, acc: 92.8799, loss_bbox: 0.2431, loss_mask: 0.2468, loss: 0.7542
2024-05-29 16:07:44,293 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 11:12:08, time: 0.685, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0498, loss_cls: 0.2039, acc: 92.6106, loss_bbox: 0.2564, loss_mask: 0.2512, loss: 0.7825
2024-05-29 16:08:20,424 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 11:11:37, time: 0.723, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0492, loss_cls: 0.2053, acc: 92.5291, loss_bbox: 0.2547, loss_mask: 0.2427, loss: 0.7736
2024-05-29 16:08:54,362 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 11:11:01, time: 0.679, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0455, loss_cls: 0.1933, acc: 93.0530, loss_bbox: 0.2389, loss_mask: 0.2398, loss: 0.7375
2024-05-29 16:09:28,042 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 11:10:25, time: 0.674, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0489, loss_cls: 0.1974, acc: 92.9568, loss_bbox: 0.2438, loss_mask: 0.2474, loss: 0.7600
2024-05-29 16:10:01,459 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 11:09:48, time: 0.668, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0470, loss_cls: 0.2021, acc: 92.8103, loss_bbox: 0.2434, loss_mask: 0.2484, loss: 0.7617
2024-05-29 16:10:35,991 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 11:09:14, time: 0.691, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0497, loss_cls: 0.1988, acc: 92.8643, loss_bbox: 0.2433,
loss_mask: 0.2456, loss: 0.7598 2024-05-29 16:11:09,700 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 11:08:38, time: 0.674, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0467, loss_cls: 0.1981, acc: 92.9026, loss_bbox: 0.2472, loss_mask: 0.2474, loss: 0.7586 2024-05-29 16:11:42,929 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 11:08:01, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0519, loss_cls: 0.1998, acc: 92.7971, loss_bbox: 0.2473, loss_mask: 0.2458, loss: 0.7671 2024-05-29 16:12:19,118 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 11:07:30, time: 0.724, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0474, loss_cls: 0.1879, acc: 93.2876, loss_bbox: 0.2379, loss_mask: 0.2447, loss: 0.7384 2024-05-29 16:12:53,228 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 11:06:55, time: 0.682, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0507, loss_cls: 0.2045, acc: 92.6091, loss_bbox: 0.2517, loss_mask: 0.2528, loss: 0.7813 2024-05-29 16:13:27,217 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 11:06:20, time: 0.680, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0503, loss_cls: 0.1967, acc: 92.8799, loss_bbox: 0.2534, loss_mask: 0.2486, loss: 0.7696 2024-05-29 16:14:00,979 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 11:05:44, time: 0.675, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0495, loss_cls: 0.2037, acc: 92.6963, loss_bbox: 0.2508, loss_mask: 0.2470, loss: 0.7713 2024-05-29 16:14:34,271 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 11:05:07, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0485, loss_cls: 0.1952, acc: 92.9729, loss_bbox: 0.2452, loss_mask: 0.2481, loss: 0.7562 2024-05-29 16:15:07,969 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 11:04:31, time: 0.674, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0465, loss_cls: 0.1991, acc: 92.7737, loss_bbox: 0.2483, loss_mask: 0.2463, loss: 0.7595 2024-05-29 16:15:41,672 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 11:03:55, time: 0.674, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0486, loss_cls: 0.1990, acc: 92.7629, loss_bbox: 0.2434, loss_mask: 0.2414, loss: 0.7538 2024-05-29 16:16:21,756 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 11:03:31, time: 0.802, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0485, loss_cls: 0.1967, acc: 93.0764, loss_bbox: 0.2439, loss_mask: 0.2458, loss: 0.7555 2024-05-29 16:16:57,903 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 11:03:00, time: 0.723, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0518, loss_cls: 0.2032, acc: 92.6570, loss_bbox: 0.2510, loss_mask: 0.2462, loss: 0.7757 2024-05-29 16:17:35,950 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 11:02:32, time: 0.761, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0477, loss_cls: 0.1912, acc: 92.9036, loss_bbox: 0.2450, loss_mask: 0.2444, loss: 0.7501 2024-05-29 16:18:12,342 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 11:02:02, time: 0.728, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0482, loss_cls: 0.2026, acc: 92.7976, loss_bbox: 0.2456, loss_mask: 0.2470, loss: 0.7665 2024-05-29 16:18:49,542 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 11:01:32, time: 0.744, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0480, loss_cls: 0.2017, acc: 92.7959, loss_bbox: 0.2518, loss_mask: 0.2472, loss: 0.7696 2024-05-29 16:19:23,353 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 11:00:56, time: 0.676, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0483, loss_cls: 0.2021, acc: 92.7063, loss_bbox: 0.2550, loss_mask: 0.2478, loss: 0.7744 2024-05-29 16:19:57,303 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 11:00:21, time: 0.679, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0481, loss_cls: 0.2064, acc: 92.7500, loss_bbox: 0.2429, loss_mask: 0.2484, loss: 0.7684 2024-05-29 16:20:30,673 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 10:59:44, time: 0.667, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0462, loss_cls: 0.1943, acc: 93.0720, loss_bbox: 0.2412, loss_mask: 0.2384, loss: 0.7399 2024-05-29 16:21:04,222 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 10:59:08, time: 0.671, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0477, loss_cls: 0.1988, acc: 92.9346, loss_bbox: 0.2487, loss_mask: 0.2471, loss: 0.7640 2024-05-29 16:21:38,522 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 10:58:33, time: 0.686, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0532, loss_cls: 0.2077, acc: 92.5823, loss_bbox: 0.2566, loss_mask: 0.2503, loss: 0.7897 2024-05-29 16:22:14,723 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 10:58:02, time: 0.724, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0475, loss_cls: 0.2007, acc: 92.7812, loss_bbox: 0.2465, loss_mask: 0.2470, loss: 0.7618 2024-05-29 16:22:48,934 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 10:57:27, time: 0.684, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0490, loss_cls: 0.2068, acc: 92.5781, loss_bbox: 0.2492, loss_mask: 0.2464, loss: 0.7729 2024-05-29 16:23:22,551 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 10:56:51, time: 0.672, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0481, loss_cls: 0.2079, acc: 92.5623, loss_bbox: 0.2567, loss_mask: 0.2454, loss: 0.7792 2024-05-29 16:23:55,617 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 10:56:14, time: 0.661, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0471, loss_cls: 0.1951, acc: 93.0010, loss_bbox: 0.2441, loss_mask: 0.2465, loss: 0.7535 2024-05-29 16:24:29,260 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 10:55:38, time: 0.673, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0472, loss_cls: 0.2014, acc: 92.8071, loss_bbox: 0.2449, loss_mask: 0.2444, loss: 0.7587 2024-05-29 16:25:03,244 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 10:55:02, time: 0.679, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0488, loss_cls: 0.2016, acc: 92.6990, loss_bbox: 0.2485, loss_mask: 0.2446, loss: 0.7652 2024-05-29 16:25:36,985 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 10:54:26, time: 0.675, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0487, loss_cls: 0.1989, acc: 92.9670, loss_bbox: 0.2415, loss_mask: 0.2475, loss: 0.7585 2024-05-29 16:26:11,333 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 10:53:52, time: 0.687, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0211, 
loss_rpn_bbox: 0.0480, loss_cls: 0.1940, acc: 93.1077, loss_bbox: 0.2469, loss_mask: 0.2491, loss: 0.7591 2024-05-29 16:26:52,627 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 10:53:30, time: 0.826, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0490, loss_cls: 0.2027, acc: 92.7461, loss_bbox: 0.2445, loss_mask: 0.2436, loss: 0.7610 2024-05-29 16:27:25,733 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 10:52:53, time: 0.662, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0469, loss_cls: 0.1945, acc: 93.1401, loss_bbox: 0.2405, loss_mask: 0.2414, loss: 0.7431 2024-05-29 16:28:07,234 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 10:52:31, time: 0.830, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0471, loss_cls: 0.1925, acc: 93.1196, loss_bbox: 0.2410, loss_mask: 0.2463, loss: 0.7473 2024-05-29 16:28:45,228 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 10:52:03, time: 0.760, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0506, loss_cls: 0.2028, acc: 92.6177, loss_bbox: 0.2536, loss_mask: 0.2504, loss: 0.7787 2024-05-29 16:29:18,083 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 10:51:25, time: 0.657, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0441, loss_cls: 0.1984, acc: 92.9973, loss_bbox: 0.2389, loss_mask: 0.2492, loss: 0.7505 2024-05-29 16:29:51,247 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 10:50:48, time: 0.663, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0473, loss_cls: 0.2028, acc: 92.7407, loss_bbox: 0.2418, loss_mask: 0.2461, loss: 0.7591 2024-05-29 16:30:25,359 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 10:50:13, time: 0.682, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0484, loss_cls: 0.2030, acc: 92.6707, loss_bbox: 0.2539, loss_mask: 0.2453, loss: 0.7712 2024-05-29 16:30:59,308 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 10:49:38, time: 0.679, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0483, loss_cls: 0.2055, acc: 92.7136, loss_bbox: 0.2508, loss_mask: 0.2508, loss: 0.7774 2024-05-29 16:31:33,352 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 10:49:02, time: 0.681, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0489, loss_cls: 0.2064, acc: 92.6238, loss_bbox: 0.2485, loss_mask: 0.2448, loss: 0.7697 2024-05-29 16:32:06,792 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 10:48:26, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0471, loss_cls: 0.2009, acc: 92.8806, loss_bbox: 0.2435, loss_mask: 0.2390, loss: 0.7509 2024-05-29 16:32:43,509 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 10:47:55, time: 0.734, data_time: 0.070, memory: 11628, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0513, loss_cls: 0.2017, acc: 92.7712, loss_bbox: 0.2499, loss_mask: 0.2444, loss: 0.7709 2024-05-29 16:33:17,128 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 10:47:19, time: 0.672, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0466, loss_cls: 0.1978, acc: 92.8513, loss_bbox: 0.2461, loss_mask: 0.2478, loss: 0.7588 2024-05-29 16:33:50,927 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 10:46:44, time: 0.676, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0497, loss_cls: 0.1984, acc: 92.8616, loss_bbox: 0.2444, loss_mask: 0.2468, 
loss: 0.7606 2024-05-29 16:34:24,895 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 10:46:08, time: 0.680, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0480, loss_cls: 0.2096, acc: 92.6096, loss_bbox: 0.2526, loss_mask: 0.2477, loss: 0.7785 2024-05-29 16:34:58,386 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 10:45:32, time: 0.670, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0471, loss_cls: 0.1903, acc: 93.2190, loss_bbox: 0.2347, loss_mask: 0.2387, loss: 0.7337 2024-05-29 16:35:32,039 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 10:44:56, time: 0.673, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0476, loss_cls: 0.2026, acc: 92.7556, loss_bbox: 0.2491, loss_mask: 0.2469, loss: 0.7689 2024-05-29 16:36:05,827 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 10:44:20, time: 0.676, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0498, loss_cls: 0.2011, acc: 92.7634, loss_bbox: 0.2458, loss_mask: 0.2410, loss: 0.7596 2024-05-29 16:36:39,666 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 10:43:44, time: 0.677, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0497, loss_cls: 0.1992, acc: 92.8701, loss_bbox: 0.2464, loss_mask: 0.2490, loss: 0.7665 2024-05-29 16:37:22,341 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 10:43:24, time: 0.853, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0462, loss_cls: 0.1949, acc: 93.0742, loss_bbox: 0.2400, loss_mask: 0.2440, loss: 0.7477 2024-05-29 16:37:55,876 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 10:42:48, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0483, loss_cls: 0.2002, acc: 92.8787, loss_bbox: 0.2439, loss_mask: 0.2452, loss: 0.7595 2024-05-29 16:38:37,093 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 10:42:25, time: 0.824, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0478, loss_cls: 0.2040, acc: 92.7168, loss_bbox: 0.2503, loss_mask: 0.2472, loss: 0.7717 2024-05-29 16:39:13,712 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 10:41:54, time: 0.732, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0501, loss_cls: 0.2054, acc: 92.5803, loss_bbox: 0.2566, loss_mask: 0.2539, loss: 0.7886 2024-05-29 16:39:46,834 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 10:41:17, time: 0.662, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0451, loss_cls: 0.1910, acc: 93.1287, loss_bbox: 0.2361, loss_mask: 0.2390, loss: 0.7310 2024-05-29 16:40:20,486 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 10:40:41, time: 0.673, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0499, loss_cls: 0.2009, acc: 92.7373, loss_bbox: 0.2473, loss_mask: 0.2544, loss: 0.7750 2024-05-29 16:40:54,540 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 10:40:06, time: 0.681, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0472, loss_cls: 0.2120, acc: 92.3066, loss_bbox: 0.2579, loss_mask: 0.2511, loss: 0.7893 2024-05-29 16:41:27,804 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 10:39:29, time: 0.665, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0472, loss_cls: 0.2081, acc: 92.5083, loss_bbox: 0.2567, loss_mask: 0.2458, loss: 0.7801 2024-05-29 16:42:01,151 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
10:38:53, time: 0.667, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0465, loss_cls: 0.1934, acc: 93.0090, loss_bbox: 0.2391, loss_mask: 0.2457, loss: 0.7460 2024-05-29 16:42:34,614 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 10:38:17, time: 0.669, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0498, loss_cls: 0.2033, acc: 92.9478, loss_bbox: 0.2435, loss_mask: 0.2490, loss: 0.7676 2024-05-29 16:43:10,351 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 10:37:44, time: 0.715, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0495, loss_cls: 0.1960, acc: 92.8855, loss_bbox: 0.2490, loss_mask: 0.2468, loss: 0.7648 2024-05-29 16:43:43,770 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 10:37:08, time: 0.668, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0460, loss_cls: 0.1992, acc: 92.7727, loss_bbox: 0.2469, loss_mask: 0.2465, loss: 0.7597 2024-05-29 16:44:16,999 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 10:36:31, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0448, loss_cls: 0.1919, acc: 93.1167, loss_bbox: 0.2345, loss_mask: 0.2430, loss: 0.7344 2024-05-29 16:44:50,732 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 10:35:55, time: 0.675, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0468, loss_cls: 0.1979, acc: 92.9731, loss_bbox: 0.2357, loss_mask: 0.2398, loss: 0.7405 2024-05-29 16:45:23,830 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 10:35:18, time: 0.662, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0465, loss_cls: 0.1938, acc: 93.0732, loss_bbox: 0.2387, loss_mask: 0.2424, loss: 0.7434 2024-05-29 16:45:57,579 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 10:34:43, time: 0.675, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0490, loss_cls: 0.2038, acc: 92.7468, loss_bbox: 0.2497, loss_mask: 0.2549, loss: 0.7787 2024-05-29 16:46:31,197 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 10:34:07, time: 0.672, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0489, loss_cls: 0.2031, acc: 92.7285, loss_bbox: 0.2502, loss_mask: 0.2489, loss: 0.7727 2024-05-29 16:47:04,624 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 10:33:30, time: 0.669, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0479, loss_cls: 0.1958, acc: 93.0264, loss_bbox: 0.2423, loss_mask: 0.2432, loss: 0.7501 2024-05-29 16:47:45,339 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 10:33:06, time: 0.814, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0505, loss_cls: 0.1905, acc: 93.0630, loss_bbox: 0.2383, loss_mask: 0.2417, loss: 0.7409 2024-05-29 16:48:24,022 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 10:32:39, time: 0.774, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0460, loss_cls: 0.1989, acc: 92.8943, loss_bbox: 0.2380, loss_mask: 0.2437, loss: 0.7470 2024-05-29 16:48:59,843 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 10:32:06, time: 0.716, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0491, loss_cls: 0.2015, acc: 92.8127, loss_bbox: 0.2462, loss_mask: 0.2526, loss: 0.7713 2024-05-29 16:49:36,604 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 10:31:36, time: 0.735, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0229, loss_rpn_bbox: 
0.0489, loss_cls: 0.2055, acc: 92.6338, loss_bbox: 0.2531, loss_mask: 0.2492, loss: 0.7796 2024-05-29 16:50:09,895 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 10:30:59, time: 0.666, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0513, loss_cls: 0.2101, acc: 92.4670, loss_bbox: 0.2545, loss_mask: 0.2530, loss: 0.7887 2024-05-29 16:50:43,775 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 10:30:24, time: 0.678, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0479, loss_cls: 0.2033, acc: 92.6282, loss_bbox: 0.2482, loss_mask: 0.2409, loss: 0.7603 2024-05-29 16:51:17,703 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 10:29:48, time: 0.679, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0488, loss_cls: 0.2029, acc: 92.7205, loss_bbox: 0.2475, loss_mask: 0.2460, loss: 0.7658 2024-05-29 16:51:51,342 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 10:29:12, time: 0.673, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0457, loss_cls: 0.1968, acc: 92.8271, loss_bbox: 0.2460, loss_mask: 0.2502, loss: 0.7599 2024-05-29 16:52:24,633 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 10:28:36, time: 0.666, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0471, loss_cls: 0.2000, acc: 92.8906, loss_bbox: 0.2388, loss_mask: 0.2415, loss: 0.7479 2024-05-29 16:52:57,660 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 10:27:59, time: 0.661, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0461, loss_cls: 0.1908, acc: 93.1033, loss_bbox: 0.2411, loss_mask: 0.2413, loss: 0.7401 2024-05-29 16:53:33,548 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 10:27:26, time: 0.718, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0467, loss_cls: 0.1962, acc: 92.9414, loss_bbox: 0.2438, loss_mask: 0.2424, loss: 0.7490 2024-05-29 16:54:07,738 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 10:26:51, time: 0.684, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0459, loss_cls: 0.1970, acc: 92.8608, loss_bbox: 0.2468, loss_mask: 0.2440, loss: 0.7557 2024-05-29 16:54:41,372 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 10:26:15, time: 0.673, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0477, loss_cls: 0.1971, acc: 92.9111, loss_bbox: 0.2466, loss_mask: 0.2475, loss: 0.7602 2024-05-29 16:55:14,508 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 10:25:39, time: 0.663, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0508, loss_cls: 0.2040, acc: 92.7029, loss_bbox: 0.2456, loss_mask: 0.2434, loss: 0.7674 2024-05-29 16:55:47,973 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 10:25:02, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0488, loss_cls: 0.2039, acc: 92.8433, loss_bbox: 0.2445, loss_mask: 0.2452, loss: 0.7653 2024-05-29 16:56:20,874 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 10:24:25, time: 0.658, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0504, loss_cls: 0.2126, acc: 92.3977, loss_bbox: 0.2601, loss_mask: 0.2526, loss: 0.7971 2024-05-29 16:56:53,001 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 10:23:47, time: 0.642, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0439, loss_cls: 0.1810, acc: 93.4856, loss_bbox: 0.2214, loss_mask: 0.2395, loss: 0.7058 
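Note (added for readability, not part of the original run): in these per-iteration entries the reported `loss` is the sum of the five component terms printed alongside it; for example, for Epoch [5][4400/7330] above, 0.0200 + 0.0439 + 0.1810 + 0.2214 + 0.2395 = 0.7058. A minimal, purely illustrative check of that relationship, using only the Python standard library (the parsing helper below is an assumption of this note, not an mmdet utility):

# Illustrative sketch: verify that the logged total `loss` equals the sum of its components.
# Values are copied verbatim from the Epoch [5][4400/7330] entry above.
import re

entry = ("loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0439, loss_cls: 0.1810, "
         "acc: 93.4856, loss_bbox: 0.2214, loss_mask: 0.2395, loss: 0.7058")

# Parse "key: value" pairs from the log fragment into a dict of floats.
fields = {k: float(v) for k, v in re.findall(r"(\w+): ([\d.]+)", entry)}
components = ("loss_rpn_cls", "loss_rpn_bbox", "loss_cls", "loss_bbox", "loss_mask")
component_sum = sum(fields[k] for k in components)
print(component_sum)  # ~0.7058, up to float rounding
assert abs(component_sum - fields["loss"]) < 1e-3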
2024-05-29 16:57:28,540 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 10:23:14, time: 0.711, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0491, loss_cls: 0.2042, acc: 92.7512, loss_bbox: 0.2489, loss_mask: 0.2538, loss: 0.7791 2024-05-29 16:58:06,510 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 10:22:45, time: 0.759, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0488, loss_cls: 0.2009, acc: 92.8496, loss_bbox: 0.2418, loss_mask: 0.2491, loss: 0.7626 2024-05-29 16:58:44,251 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 10:22:16, time: 0.755, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0477, loss_cls: 0.1985, acc: 92.8792, loss_bbox: 0.2404, loss_mask: 0.2382, loss: 0.7464 2024-05-29 16:59:19,306 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 10:21:42, time: 0.701, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0477, loss_cls: 0.1985, acc: 92.7598, loss_bbox: 0.2500, loss_mask: 0.2494, loss: 0.7658 2024-05-29 16:59:55,239 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 10:21:10, time: 0.719, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0467, loss_cls: 0.2021, acc: 92.8782, loss_bbox: 0.2428, loss_mask: 0.2402, loss: 0.7532 2024-05-29 17:00:28,388 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 10:20:33, time: 0.663, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0460, loss_cls: 0.1966, acc: 92.8999, loss_bbox: 0.2418, loss_mask: 0.2440, loss: 0.7496 2024-05-29 17:01:01,868 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 10:19:57, time: 0.670, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0484, loss_cls: 0.2042, acc: 92.5447, loss_bbox: 0.2524, loss_mask: 0.2496, loss: 0.7772 2024-05-29 17:01:35,486 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 10:19:21, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0488, loss_cls: 0.1945, acc: 92.9834, loss_bbox: 0.2373, loss_mask: 0.2388, loss: 0.7424 2024-05-29 17:02:08,877 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 10:18:45, time: 0.668, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0466, loss_cls: 0.2008, acc: 92.7754, loss_bbox: 0.2434, loss_mask: 0.2512, loss: 0.7625 2024-05-29 17:02:42,107 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 10:18:08, time: 0.665, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0468, loss_cls: 0.1962, acc: 92.9460, loss_bbox: 0.2391, loss_mask: 0.2424, loss: 0.7459 2024-05-29 17:03:15,496 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 10:17:32, time: 0.668, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0480, loss_cls: 0.1994, acc: 92.9753, loss_bbox: 0.2447, loss_mask: 0.2461, loss: 0.7596 2024-05-29 17:03:50,617 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 10:16:59, time: 0.702, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0486, loss_cls: 0.2033, acc: 92.7014, loss_bbox: 0.2535, loss_mask: 0.2461, loss: 0.7734 2024-05-29 17:04:23,407 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 10:16:21, time: 0.656, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0483, loss_cls: 0.2082, acc: 92.6062, loss_bbox: 0.2527, loss_mask: 0.2495, loss: 0.7814 2024-05-29 17:04:57,192 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 10:15:46, 
time: 0.676, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0498, loss_cls: 0.2076, acc: 92.4556, loss_bbox: 0.2536, loss_mask: 0.2458, loss: 0.7800 2024-05-29 17:05:31,037 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 10:15:10, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0463, loss_cls: 0.1991, acc: 92.7566, loss_bbox: 0.2405, loss_mask: 0.2469, loss: 0.7538 2024-05-29 17:06:04,069 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 10:14:33, time: 0.661, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0490, loss_cls: 0.1960, acc: 92.7998, loss_bbox: 0.2449, loss_mask: 0.2418, loss: 0.7526 2024-05-29 17:06:37,458 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 10:13:57, time: 0.668, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0495, loss_cls: 0.2081, acc: 92.5005, loss_bbox: 0.2526, loss_mask: 0.2472, loss: 0.7809 2024-05-29 17:07:11,154 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 10:13:22, time: 0.674, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0470, loss_cls: 0.2045, acc: 92.7200, loss_bbox: 0.2492, loss_mask: 0.2418, loss: 0.7619 2024-05-29 17:07:48,233 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 10:12:51, time: 0.742, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0463, loss_cls: 0.1967, acc: 92.8982, loss_bbox: 0.2413, loss_mask: 0.2411, loss: 0.7472 2024-05-29 17:08:23,426 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 10:12:18, time: 0.704, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0467, loss_cls: 0.1912, acc: 93.1418, loss_bbox: 0.2362, loss_mask: 0.2415, loss: 0.7395 2024-05-29 17:09:01,304 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 10:11:48, time: 0.758, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0484, loss_cls: 0.1942, acc: 93.0210, loss_bbox: 0.2403, loss_mask: 0.2478, loss: 0.7513 2024-05-29 17:09:36,904 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 10:11:15, time: 0.712, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0478, loss_cls: 0.2077, acc: 92.6577, loss_bbox: 0.2512, loss_mask: 0.2479, loss: 0.7760 2024-05-29 17:10:13,020 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 10:10:43, time: 0.722, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0471, loss_cls: 0.1977, acc: 92.9036, loss_bbox: 0.2401, loss_mask: 0.2408, loss: 0.7472 2024-05-29 17:10:46,262 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 10:10:07, time: 0.665, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0456, loss_cls: 0.1960, acc: 92.8096, loss_bbox: 0.2452, loss_mask: 0.2479, loss: 0.7546 2024-05-29 17:11:19,551 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 10:09:30, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0483, loss_cls: 0.1944, acc: 93.1165, loss_bbox: 0.2419, loss_mask: 0.2462, loss: 0.7527 2024-05-29 17:11:52,858 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 10:08:54, time: 0.666, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0465, loss_cls: 0.2023, acc: 92.7554, loss_bbox: 0.2393, loss_mask: 0.2417, loss: 0.7530 2024-05-29 17:12:25,965 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 10:08:18, time: 0.662, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0477, 
loss_cls: 0.1989, acc: 92.8933, loss_bbox: 0.2445, loss_mask: 0.2458, loss: 0.7565 2024-05-29 17:12:59,395 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 10:07:41, time: 0.669, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0516, loss_cls: 0.2004, acc: 92.7844, loss_bbox: 0.2489, loss_mask: 0.2444, loss: 0.7672 2024-05-29 17:13:35,900 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 10:07:10, time: 0.730, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0502, loss_cls: 0.2046, acc: 92.4438, loss_bbox: 0.2595, loss_mask: 0.2415, loss: 0.7776 2024-05-29 17:14:09,367 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 10:06:34, time: 0.669, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0488, loss_cls: 0.2012, acc: 92.7163, loss_bbox: 0.2500, loss_mask: 0.2430, loss: 0.7642 2024-05-29 17:14:42,738 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 10:05:58, time: 0.667, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0470, loss_cls: 0.1991, acc: 92.8184, loss_bbox: 0.2449, loss_mask: 0.2409, loss: 0.7518 2024-05-29 17:15:15,615 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 10:05:21, time: 0.658, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0472, loss_cls: 0.1969, acc: 92.9500, loss_bbox: 0.2405, loss_mask: 0.2455, loss: 0.7518 2024-05-29 17:15:49,077 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 10:04:45, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0476, loss_cls: 0.1976, acc: 92.9524, loss_bbox: 0.2454, loss_mask: 0.2438, loss: 0.7549 2024-05-29 17:16:22,575 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 10:04:09, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0447, loss_cls: 0.1964, acc: 92.9385, loss_bbox: 0.2402, loss_mask: 0.2456, loss: 0.7458 2024-05-29 17:16:55,698 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 10:03:32, time: 0.662, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0454, loss_cls: 0.2017, acc: 92.7976, loss_bbox: 0.2424, loss_mask: 0.2436, loss: 0.7547 2024-05-29 17:17:28,925 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 10:02:56, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0460, loss_cls: 0.1922, acc: 93.1763, loss_bbox: 0.2376, loss_mask: 0.2394, loss: 0.7353 2024-05-29 17:18:06,361 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 10:02:26, time: 0.749, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0445, loss_cls: 0.2002, acc: 92.9993, loss_bbox: 0.2370, loss_mask: 0.2391, loss: 0.7416 2024-05-29 17:18:41,884 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 10:01:53, time: 0.710, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0498, loss_cls: 0.2099, acc: 92.5493, loss_bbox: 0.2561, loss_mask: 0.2526, loss: 0.7917 2024-05-29 17:19:20,396 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 10:01:24, time: 0.770, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0463, loss_cls: 0.1956, acc: 92.9375, loss_bbox: 0.2427, loss_mask: 0.2400, loss: 0.7455 2024-05-29 17:19:56,826 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 10:00:52, time: 0.729, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0461, loss_cls: 0.1972, acc: 92.9895, loss_bbox: 0.2415, loss_mask: 0.2461, loss: 0.7518 2024-05-29 
17:20:32,993 - mmdet - INFO - Epoch [5][6450/7330] lr: 1.000e-04, eta: 10:00:20, time: 0.723, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0484, loss_cls: 0.2058, acc: 92.7141, loss_bbox: 0.2479, loss_mask: 0.2471, loss: 0.7715 2024-05-29 17:21:06,466 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 9:59:44, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0457, loss_cls: 0.1946, acc: 92.9807, loss_bbox: 0.2434, loss_mask: 0.2397, loss: 0.7434 2024-05-29 17:21:40,053 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 9:59:08, time: 0.672, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0458, loss_cls: 0.2035, acc: 92.7749, loss_bbox: 0.2446, loss_mask: 0.2417, loss: 0.7567 2024-05-29 17:22:13,097 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 9:58:32, time: 0.661, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0490, loss_cls: 0.2078, acc: 92.5164, loss_bbox: 0.2565, loss_mask: 0.2474, loss: 0.7839 2024-05-29 17:22:46,677 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 9:57:56, time: 0.672, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0497, loss_cls: 0.1996, acc: 92.8662, loss_bbox: 0.2420, loss_mask: 0.2421, loss: 0.7560 2024-05-29 17:23:19,952 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 9:57:20, time: 0.666, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0452, loss_cls: 0.1898, acc: 93.2629, loss_bbox: 0.2309, loss_mask: 0.2369, loss: 0.7232 2024-05-29 17:23:55,208 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 9:56:46, time: 0.705, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0472, loss_cls: 0.2010, acc: 92.9009, loss_bbox: 0.2441, loss_mask: 0.2400, loss: 0.7524 2024-05-29 17:24:28,453 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 9:56:10, time: 0.665, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0440, loss_cls: 0.2009, acc: 92.8386, loss_bbox: 0.2449, loss_mask: 0.2461, loss: 0.7562 2024-05-29 17:25:01,567 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 9:55:33, time: 0.662, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0455, loss_cls: 0.2041, acc: 92.7354, loss_bbox: 0.2430, loss_mask: 0.2452, loss: 0.7592 2024-05-29 17:25:34,504 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 9:54:57, time: 0.659, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0447, loss_cls: 0.1952, acc: 93.0325, loss_bbox: 0.2375, loss_mask: 0.2464, loss: 0.7435 2024-05-29 17:26:08,039 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 9:54:21, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0459, loss_cls: 0.2015, acc: 92.7844, loss_bbox: 0.2446, loss_mask: 0.2475, loss: 0.7602 2024-05-29 17:26:41,787 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 9:53:45, time: 0.675, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0464, loss_cls: 0.1939, acc: 93.0139, loss_bbox: 0.2375, loss_mask: 0.2420, loss: 0.7409 2024-05-29 17:27:15,165 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 9:53:09, time: 0.668, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0454, loss_cls: 0.1964, acc: 92.9402, loss_bbox: 0.2399, loss_mask: 0.2494, loss: 0.7502 2024-05-29 17:27:48,088 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 9:52:33, time: 0.658, data_time: 
0.053, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0477, loss_cls: 0.2050, acc: 92.7725, loss_bbox: 0.2455, loss_mask: 0.2432, loss: 0.7628
2024-05-29 17:28:25,629 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 9:52:02, time: 0.751, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0469, loss_cls: 0.2031, acc: 92.6606, loss_bbox: 0.2473, loss_mask: 0.2485, loss: 0.7683
2024-05-29 17:29:00,707 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 9:51:29, time: 0.702, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0442, loss_cls: 0.2018, acc: 92.8877, loss_bbox: 0.2443, loss_mask: 0.2463, loss: 0.7567
2024-05-29 17:29:41,076 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 9:50:59, time: 0.762, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0476, loss_cls: 0.1969, acc: 92.9001, loss_bbox: 0.2408, loss_mask: 0.2451, loss: 0.7525
2024-05-29 17:30:14,465 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 9:50:26, time: 0.713, data_time: 0.089, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0467, loss_cls: 0.2001, acc: 92.7639, loss_bbox: 0.2440, loss_mask: 0.2503, loss: 0.7636
2024-05-29 17:30:37,303 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-05-29 17:32:32,666 - mmdet - INFO - Evaluating bbox...
2024-05-29 17:32:56,965 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.415
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.648
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.454
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.246
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.456
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.567
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.541
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.541
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.541
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.347
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.587
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.700
2024-05-29 17:32:56,966 - mmdet - INFO - Evaluating segm...
2024-05-29 17:33:25,785 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.379
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.611
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.402
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.172
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.415
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.586
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.494
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.494
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.494
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.287
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.545
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.675
2024-05-29 17:33:26,174 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 17:33:26,175 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4150, bbox_mAP_50: 0.6480, bbox_mAP_75: 0.4540, bbox_mAP_s: 0.2460, bbox_mAP_m: 0.4560, bbox_mAP_l: 0.5670, bbox_mAP_copypaste: 0.415 0.648 0.454 0.246 0.456 0.567, segm_mAP: 0.3790, segm_mAP_50: 0.6110, segm_mAP_75: 0.4020, segm_mAP_s: 0.1720, segm_mAP_m: 0.4150, segm_mAP_l: 0.5860, segm_mAP_copypaste: 0.379 0.611 0.402 0.172 0.415 0.586
2024-05-29 17:34:03,705 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 9:49:06, time: 0.750, data_time: 0.114, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0446, loss_cls: 0.1784, acc: 93.3723, loss_bbox: 0.2257, loss_mask: 0.2354, loss: 0.7027
2024-05-29 17:34:37,143 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 9:48:31, time: 0.669, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0426, loss_cls: 0.1877, acc: 93.1460, loss_bbox: 0.2322, loss_mask: 0.2377, loss: 0.7176
2024-05-29 17:35:12,941 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 9:47:58, time: 0.716, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0454, loss_cls: 0.1938, acc: 93.0044, loss_bbox: 0.2389, loss_mask: 0.2395, loss: 0.7372
2024-05-29 17:35:46,765 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 9:47:22, time: 0.676, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0478, loss_cls: 0.1900, acc: 92.9604, loss_bbox: 0.2373, loss_mask: 0.2391, loss: 0.7339
2024-05-29 17:36:20,889 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 9:46:48, time: 0.682, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0498, loss_cls: 0.1897, acc: 93.0183, loss_bbox: 0.2447, loss_mask: 0.2474, loss: 0.7511
2024-05-29 17:36:53,894 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 9:46:11, time: 0.660, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0456, loss_cls: 0.1827, acc: 93.2812, loss_bbox: 0.2305, loss_mask: 0.2328, loss: 0.7124
2024-05-29 17:37:27,240 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 17:37:27,241 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 9:45:35, time: 0.667, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0428, loss_cls: 0.1844, acc: 93.2021, loss_bbox: 0.2378, loss_mask: 0.2420, loss: 0.7259
2024-05-29 17:38:00,862 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 9:44:59, time: 0.673, data_time: 0.055, memory: 11628,
loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0442, loss_cls: 0.1792, acc: 93.5103, loss_bbox: 0.2270, loss_mask: 0.2375, loss: 0.7059 2024-05-29 17:38:34,481 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 9:44:24, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0485, loss_cls: 0.2045, acc: 92.5706, loss_bbox: 0.2570, loss_mask: 0.2520, loss: 0.7823 2024-05-29 17:39:08,281 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 9:43:48, time: 0.676, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0478, loss_cls: 0.1925, acc: 92.9814, loss_bbox: 0.2423, loss_mask: 0.2416, loss: 0.7436 2024-05-29 17:39:41,947 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 9:43:13, time: 0.673, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0473, loss_cls: 0.1907, acc: 93.0989, loss_bbox: 0.2368, loss_mask: 0.2359, loss: 0.7304 2024-05-29 17:40:15,204 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 9:42:37, time: 0.665, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0439, loss_cls: 0.1825, acc: 93.4873, loss_bbox: 0.2268, loss_mask: 0.2315, loss: 0.7029 2024-05-29 17:40:48,360 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 9:42:00, time: 0.663, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0472, loss_cls: 0.1841, acc: 93.2334, loss_bbox: 0.2320, loss_mask: 0.2405, loss: 0.7227 2024-05-29 17:41:22,144 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 9:41:25, time: 0.676, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0485, loss_cls: 0.1945, acc: 92.9226, loss_bbox: 0.2381, loss_mask: 0.2373, loss: 0.7389 2024-05-29 17:41:55,419 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 9:40:49, time: 0.665, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0454, loss_cls: 0.1868, acc: 93.1160, loss_bbox: 0.2340, loss_mask: 0.2396, loss: 0.7242 2024-05-29 17:42:31,577 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 9:40:17, time: 0.723, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0480, loss_cls: 0.1861, acc: 93.1536, loss_bbox: 0.2356, loss_mask: 0.2441, loss: 0.7325 2024-05-29 17:43:07,175 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 9:39:44, time: 0.712, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0453, loss_cls: 0.1910, acc: 93.0293, loss_bbox: 0.2382, loss_mask: 0.2413, loss: 0.7362 2024-05-29 17:43:42,843 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 9:39:11, time: 0.713, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0444, loss_cls: 0.1909, acc: 93.0806, loss_bbox: 0.2359, loss_mask: 0.2391, loss: 0.7294 2024-05-29 17:44:16,195 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 9:38:35, time: 0.667, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0454, loss_cls: 0.1901, acc: 93.1294, loss_bbox: 0.2381, loss_mask: 0.2455, loss: 0.7401 2024-05-29 17:44:49,478 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 9:37:59, time: 0.666, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0442, loss_cls: 0.1822, acc: 93.3911, loss_bbox: 0.2286, loss_mask: 0.2369, loss: 0.7106 2024-05-29 17:45:27,275 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 9:37:29, time: 0.756, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1883, acc: 93.3101, loss_bbox: 0.2299, loss_mask: 0.2374, loss: 
0.7171 2024-05-29 17:46:03,174 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 9:36:56, time: 0.718, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0461, loss_cls: 0.1933, acc: 92.9019, loss_bbox: 0.2448, loss_mask: 0.2447, loss: 0.7476 2024-05-29 17:46:38,861 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 9:36:23, time: 0.714, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0481, loss_cls: 0.1871, acc: 93.1570, loss_bbox: 0.2381, loss_mask: 0.2410, loss: 0.7333 2024-05-29 17:47:12,422 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 9:35:48, time: 0.671, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0455, loss_cls: 0.1821, acc: 93.3030, loss_bbox: 0.2295, loss_mask: 0.2331, loss: 0.7102 2024-05-29 17:47:49,380 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 9:35:17, time: 0.739, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0461, loss_cls: 0.1870, acc: 93.1978, loss_bbox: 0.2373, loss_mask: 0.2340, loss: 0.7235 2024-05-29 17:48:22,853 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 9:34:41, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0481, loss_cls: 0.1972, acc: 92.9690, loss_bbox: 0.2432, loss_mask: 0.2375, loss: 0.7446 2024-05-29 17:48:57,054 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 17:48:57,054 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 9:34:06, time: 0.684, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0476, loss_cls: 0.1980, acc: 92.7441, loss_bbox: 0.2492, loss_mask: 0.2432, loss: 0.7580 2024-05-29 17:49:30,670 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 9:33:30, time: 0.672, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0451, loss_cls: 0.1840, acc: 93.2339, loss_bbox: 0.2328, loss_mask: 0.2382, loss: 0.7187 2024-05-29 17:50:04,088 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 9:32:54, time: 0.668, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0463, loss_cls: 0.1951, acc: 92.9775, loss_bbox: 0.2453, loss_mask: 0.2452, loss: 0.7527 2024-05-29 17:50:37,485 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 9:32:19, time: 0.668, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0457, loss_cls: 0.1905, acc: 93.0803, loss_bbox: 0.2369, loss_mask: 0.2419, loss: 0.7343 2024-05-29 17:51:11,555 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 9:31:44, time: 0.681, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0480, loss_cls: 0.1979, acc: 92.8337, loss_bbox: 0.2440, loss_mask: 0.2396, loss: 0.7501 2024-05-29 17:51:45,769 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 9:31:09, time: 0.684, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0468, loss_cls: 0.1946, acc: 92.9629, loss_bbox: 0.2399, loss_mask: 0.2481, loss: 0.7509 2024-05-29 17:52:18,641 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 9:30:32, time: 0.657, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0446, loss_cls: 0.1835, acc: 93.2852, loss_bbox: 0.2352, loss_mask: 0.2405, loss: 0.7231 2024-05-29 17:52:54,070 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 9:29:59, time: 0.709, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0459, loss_cls: 0.1900, acc: 93.0627, loss_bbox: 0.2429, loss_mask: 0.2420, loss: 
0.7391 2024-05-29 17:53:29,250 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 9:29:25, time: 0.704, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0447, loss_cls: 0.1869, acc: 93.2393, loss_bbox: 0.2281, loss_mask: 0.2307, loss: 0.7085 2024-05-29 17:54:05,078 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 9:28:53, time: 0.717, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0436, loss_cls: 0.1927, acc: 92.9553, loss_bbox: 0.2443, loss_mask: 0.2403, loss: 0.7401 2024-05-29 17:54:38,514 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 9:28:17, time: 0.669, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0460, loss_cls: 0.1865, acc: 93.1699, loss_bbox: 0.2377, loss_mask: 0.2380, loss: 0.7270 2024-05-29 17:55:14,370 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 9:27:44, time: 0.717, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0476, loss_cls: 0.1972, acc: 92.7400, loss_bbox: 0.2483, loss_mask: 0.2437, loss: 0.7593 2024-05-29 17:55:50,436 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 9:27:12, time: 0.721, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0456, loss_cls: 0.1993, acc: 92.9075, loss_bbox: 0.2364, loss_mask: 0.2372, loss: 0.7381 2024-05-29 17:56:26,129 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 9:26:39, time: 0.714, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0452, loss_cls: 0.1843, acc: 93.2434, loss_bbox: 0.2325, loss_mask: 0.2330, loss: 0.7141 2024-05-29 17:57:02,458 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 9:26:07, time: 0.727, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0476, loss_cls: 0.1903, acc: 93.0642, loss_bbox: 0.2373, loss_mask: 0.2370, loss: 0.7316 2024-05-29 17:57:36,578 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 9:25:32, time: 0.682, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0477, loss_cls: 0.2017, acc: 92.7451, loss_bbox: 0.2465, loss_mask: 0.2471, loss: 0.7640 2024-05-29 17:58:13,198 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 9:25:00, time: 0.732, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0454, loss_cls: 0.1883, acc: 93.2363, loss_bbox: 0.2348, loss_mask: 0.2415, loss: 0.7284 2024-05-29 17:58:47,249 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 9:24:25, time: 0.681, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0507, loss_cls: 0.1959, acc: 92.8281, loss_bbox: 0.2421, loss_mask: 0.2379, loss: 0.7499 2024-05-29 17:59:20,521 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 9:23:49, time: 0.666, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0463, loss_cls: 0.1919, acc: 93.0352, loss_bbox: 0.2412, loss_mask: 0.2466, loss: 0.7461 2024-05-29 17:59:55,108 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 9:23:15, time: 0.692, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0473, loss_cls: 0.1933, acc: 93.0176, loss_bbox: 0.2414, loss_mask: 0.2428, loss: 0.7457 2024-05-29 18:00:28,711 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 18:00:28,711 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 9:22:39, time: 0.672, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0453, loss_cls: 0.1851, acc: 93.2241, loss_bbox: 0.2331, loss_mask: 0.2382, loss: 
0.7208 2024-05-29 18:01:02,669 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 9:22:04, time: 0.679, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0483, loss_cls: 0.1996, acc: 92.8328, loss_bbox: 0.2463, loss_mask: 0.2484, loss: 0.7633 2024-05-29 18:01:36,011 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 9:21:28, time: 0.667, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0458, loss_cls: 0.1894, acc: 93.1882, loss_bbox: 0.2335, loss_mask: 0.2366, loss: 0.7255 2024-05-29 18:02:09,000 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 9:20:52, time: 0.660, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0458, loss_cls: 0.1917, acc: 93.0479, loss_bbox: 0.2408, loss_mask: 0.2422, loss: 0.7400 2024-05-29 18:02:45,240 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 9:20:20, time: 0.725, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0487, loss_cls: 0.1937, acc: 92.8984, loss_bbox: 0.2414, loss_mask: 0.2429, loss: 0.7465 2024-05-29 18:03:18,623 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 9:19:44, time: 0.668, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0453, loss_cls: 0.1896, acc: 93.1057, loss_bbox: 0.2409, loss_mask: 0.2357, loss: 0.7306 2024-05-29 18:03:55,730 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 9:19:12, time: 0.742, data_time: 0.037, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0437, loss_cls: 0.1933, acc: 93.0581, loss_bbox: 0.2391, loss_mask: 0.2408, loss: 0.7345 2024-05-29 18:04:32,166 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 9:18:40, time: 0.729, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0452, loss_cls: 0.1900, acc: 93.0366, loss_bbox: 0.2338, loss_mask: 0.2389, loss: 0.7278 2024-05-29 18:05:05,521 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 9:18:05, time: 0.667, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0451, loss_cls: 0.1918, acc: 93.0183, loss_bbox: 0.2394, loss_mask: 0.2376, loss: 0.7326 2024-05-29 18:05:41,188 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 9:17:32, time: 0.713, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0449, loss_cls: 0.1942, acc: 92.9546, loss_bbox: 0.2396, loss_mask: 0.2425, loss: 0.7406 2024-05-29 18:06:19,186 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 9:17:01, time: 0.760, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0445, loss_cls: 0.1845, acc: 93.3303, loss_bbox: 0.2323, loss_mask: 0.2381, loss: 0.7182 2024-05-29 18:06:55,398 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 9:16:29, time: 0.724, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0459, loss_cls: 0.1939, acc: 92.9949, loss_bbox: 0.2373, loss_mask: 0.2406, loss: 0.7382 2024-05-29 18:07:28,871 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 9:15:53, time: 0.669, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0458, loss_cls: 0.1922, acc: 93.0688, loss_bbox: 0.2412, loss_mask: 0.2401, loss: 0.7396 2024-05-29 18:08:05,598 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 9:15:22, time: 0.735, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0461, loss_cls: 0.1990, acc: 92.7659, loss_bbox: 0.2458, loss_mask: 0.2461, loss: 0.7575 2024-05-29 18:08:39,104 - mmdet - INFO - Epoch [6][3050/7330] lr: 1.000e-04, eta: 9:14:46, time: 
0.670, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0470, loss_cls: 0.1914, acc: 93.0762, loss_bbox: 0.2416, loss_mask: 0.2386, loss: 0.7378 2024-05-29 18:09:12,658 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 9:14:10, time: 0.671, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0459, loss_cls: 0.1938, acc: 93.1011, loss_bbox: 0.2349, loss_mask: 0.2389, loss: 0.7346 2024-05-29 18:09:46,253 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 9:13:35, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0459, loss_cls: 0.1868, acc: 93.2756, loss_bbox: 0.2290, loss_mask: 0.2342, loss: 0.7150 2024-05-29 18:10:20,058 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 9:12:59, time: 0.676, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0466, loss_cls: 0.1905, acc: 92.9207, loss_bbox: 0.2423, loss_mask: 0.2441, loss: 0.7428 2024-05-29 18:10:53,322 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 9:12:23, time: 0.665, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0448, loss_cls: 0.1904, acc: 93.1099, loss_bbox: 0.2354, loss_mask: 0.2390, loss: 0.7281 2024-05-29 18:11:27,361 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 9:11:48, time: 0.681, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0477, loss_cls: 0.1927, acc: 93.0962, loss_bbox: 0.2397, loss_mask: 0.2484, loss: 0.7485 2024-05-29 18:12:01,315 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 18:12:01,316 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 9:11:13, time: 0.679, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0487, loss_cls: 0.1912, acc: 93.0488, loss_bbox: 0.2456, loss_mask: 0.2438, loss: 0.7503 2024-05-29 18:12:34,516 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 9:10:37, time: 0.664, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0470, loss_cls: 0.1880, acc: 93.1262, loss_bbox: 0.2354, loss_mask: 0.2383, loss: 0.7278 2024-05-29 18:13:09,956 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 9:10:04, time: 0.709, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0446, loss_cls: 0.1899, acc: 93.2102, loss_bbox: 0.2334, loss_mask: 0.2383, loss: 0.7250 2024-05-29 18:13:43,265 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 9:09:28, time: 0.666, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0465, loss_cls: 0.1956, acc: 92.8396, loss_bbox: 0.2432, loss_mask: 0.2406, loss: 0.7457 2024-05-29 18:14:21,269 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 9:08:58, time: 0.760, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0465, loss_cls: 0.1939, acc: 93.0300, loss_bbox: 0.2403, loss_mask: 0.2393, loss: 0.7387 2024-05-29 18:14:58,883 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 9:08:27, time: 0.752, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0442, loss_cls: 0.1857, acc: 93.2849, loss_bbox: 0.2269, loss_mask: 0.2358, loss: 0.7114 2024-05-29 18:15:32,575 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 9:07:52, time: 0.674, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0481, loss_cls: 0.1962, acc: 93.0640, loss_bbox: 0.2407, loss_mask: 0.2424, loss: 0.7471 2024-05-29 18:16:11,030 - mmdet - INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 9:07:22, time: 
0.769, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0460, loss_cls: 0.1902, acc: 93.1440, loss_bbox: 0.2327, loss_mask: 0.2396, loss: 0.7282 2024-05-29 18:16:46,775 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 9:06:49, time: 0.715, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0434, loss_cls: 0.1894, acc: 93.1694, loss_bbox: 0.2348, loss_mask: 0.2379, loss: 0.7236 2024-05-29 18:17:22,808 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 9:06:16, time: 0.721, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0463, loss_cls: 0.1902, acc: 93.1294, loss_bbox: 0.2386, loss_mask: 0.2396, loss: 0.7348 2024-05-29 18:17:56,311 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 9:05:40, time: 0.670, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0455, loss_cls: 0.1878, acc: 93.2053, loss_bbox: 0.2352, loss_mask: 0.2398, loss: 0.7265 2024-05-29 18:18:33,779 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 9:05:09, time: 0.749, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0457, loss_cls: 0.1956, acc: 92.9814, loss_bbox: 0.2381, loss_mask: 0.2356, loss: 0.7341 2024-05-29 18:19:07,037 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 9:04:33, time: 0.665, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0468, loss_cls: 0.1911, acc: 93.0181, loss_bbox: 0.2380, loss_mask: 0.2441, loss: 0.7398 2024-05-29 18:19:41,485 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 9:03:59, time: 0.689, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0452, loss_cls: 0.1969, acc: 92.8701, loss_bbox: 0.2420, loss_mask: 0.2444, loss: 0.7498 2024-05-29 18:20:15,463 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 9:03:24, time: 0.680, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0446, loss_cls: 0.1929, acc: 93.0027, loss_bbox: 0.2353, loss_mask: 0.2387, loss: 0.7312 2024-05-29 18:20:49,546 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 9:02:49, time: 0.681, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0430, loss_cls: 0.1908, acc: 93.1282, loss_bbox: 0.2328, loss_mask: 0.2356, loss: 0.7213 2024-05-29 18:21:23,333 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 9:02:14, time: 0.676, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0475, loss_cls: 0.1955, acc: 92.9688, loss_bbox: 0.2393, loss_mask: 0.2417, loss: 0.7442 2024-05-29 18:21:57,012 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 9:01:38, time: 0.674, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0466, loss_cls: 0.1955, acc: 92.8738, loss_bbox: 0.2366, loss_mask: 0.2393, loss: 0.7383 2024-05-29 18:22:31,515 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 9:01:04, time: 0.690, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0459, loss_cls: 0.1933, acc: 93.0405, loss_bbox: 0.2358, loss_mask: 0.2386, loss: 0.7334 2024-05-29 18:23:04,891 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 9:00:28, time: 0.668, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0475, loss_cls: 0.1907, acc: 92.9771, loss_bbox: 0.2351, loss_mask: 0.2389, loss: 0.7321 2024-05-29 18:23:41,119 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 18:23:41,119 - mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 8:59:55, time: 
0.725, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0460, loss_cls: 0.1888, acc: 93.1309, loss_bbox: 0.2355, loss_mask: 0.2395, loss: 0.7289 2024-05-29 18:24:17,511 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 8:59:23, time: 0.728, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0472, loss_cls: 0.2061, acc: 92.6331, loss_bbox: 0.2476, loss_mask: 0.2487, loss: 0.7711 2024-05-29 18:24:53,631 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 8:58:50, time: 0.722, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0457, loss_cls: 0.1882, acc: 93.1943, loss_bbox: 0.2345, loss_mask: 0.2409, loss: 0.7288 2024-05-29 18:25:27,751 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 8:58:15, time: 0.682, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0475, loss_cls: 0.1981, acc: 92.8521, loss_bbox: 0.2463, loss_mask: 0.2435, loss: 0.7566 2024-05-29 18:26:03,722 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 8:57:43, time: 0.719, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0438, loss_cls: 0.1876, acc: 93.1255, loss_bbox: 0.2326, loss_mask: 0.2299, loss: 0.7115 2024-05-29 18:26:39,557 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 8:57:10, time: 0.717, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0484, loss_cls: 0.2030, acc: 92.5427, loss_bbox: 0.2540, loss_mask: 0.2435, loss: 0.7689 2024-05-29 18:27:16,888 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 8:56:38, time: 0.747, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0490, loss_cls: 0.1961, acc: 92.7886, loss_bbox: 0.2440, loss_mask: 0.2368, loss: 0.7470 2024-05-29 18:27:52,897 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 8:56:06, time: 0.720, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0469, loss_cls: 0.1974, acc: 92.8994, loss_bbox: 0.2455, loss_mask: 0.2474, loss: 0.7584 2024-05-29 18:28:26,815 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 8:55:30, time: 0.678, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0475, loss_cls: 0.1902, acc: 93.1348, loss_bbox: 0.2382, loss_mask: 0.2416, loss: 0.7384 2024-05-29 18:29:03,266 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 8:54:58, time: 0.729, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0431, loss_cls: 0.1867, acc: 93.3748, loss_bbox: 0.2263, loss_mask: 0.2346, loss: 0.7101 2024-05-29 18:29:36,697 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 8:54:22, time: 0.669, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0457, loss_cls: 0.1906, acc: 93.1296, loss_bbox: 0.2347, loss_mask: 0.2366, loss: 0.7266 2024-05-29 18:30:09,981 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 8:53:46, time: 0.665, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0473, loss_cls: 0.1940, acc: 93.0398, loss_bbox: 0.2389, loss_mask: 0.2404, loss: 0.7388 2024-05-29 18:30:43,406 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 8:53:11, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0437, loss_cls: 0.1842, acc: 93.3508, loss_bbox: 0.2318, loss_mask: 0.2382, loss: 0.7186 2024-05-29 18:31:16,644 - mmdet - INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 8:52:35, time: 0.665, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0443, loss_cls: 0.1900, acc: 
93.1953, loss_bbox: 0.2323, loss_mask: 0.2401, loss: 0.7267 2024-05-29 18:31:49,777 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 8:51:59, time: 0.663, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0471, loss_cls: 0.1950, acc: 92.8809, loss_bbox: 0.2459, loss_mask: 0.2387, loss: 0.7479 2024-05-29 18:32:23,455 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 8:51:23, time: 0.674, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0441, loss_cls: 0.1917, acc: 93.0347, loss_bbox: 0.2353, loss_mask: 0.2385, loss: 0.7306 2024-05-29 18:32:56,821 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 8:50:48, time: 0.667, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0467, loss_cls: 0.1871, acc: 93.2163, loss_bbox: 0.2324, loss_mask: 0.2362, loss: 0.7230 2024-05-29 18:33:30,677 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 8:50:12, time: 0.677, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0472, loss_cls: 0.1891, acc: 93.0486, loss_bbox: 0.2382, loss_mask: 0.2435, loss: 0.7370 2024-05-29 18:34:06,371 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 8:49:39, time: 0.714, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0477, loss_cls: 0.1942, acc: 92.9875, loss_bbox: 0.2395, loss_mask: 0.2372, loss: 0.7394 2024-05-29 18:34:41,862 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 8:49:06, time: 0.710, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0456, loss_cls: 0.1917, acc: 93.0408, loss_bbox: 0.2365, loss_mask: 0.2364, loss: 0.7290 2024-05-29 18:35:18,555 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 18:35:18,555 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 8:48:34, time: 0.734, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0478, loss_cls: 0.1942, acc: 92.9565, loss_bbox: 0.2361, loss_mask: 0.2373, loss: 0.7352 2024-05-29 18:35:52,924 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 8:47:59, time: 0.687, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0468, loss_cls: 0.2057, acc: 92.6235, loss_bbox: 0.2449, loss_mask: 0.2420, loss: 0.7599 2024-05-29 18:36:28,082 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 8:47:25, time: 0.703, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0453, loss_cls: 0.1916, acc: 93.0945, loss_bbox: 0.2366, loss_mask: 0.2399, loss: 0.7337 2024-05-29 18:37:03,747 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 8:46:52, time: 0.713, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0435, loss_cls: 0.1859, acc: 93.2104, loss_bbox: 0.2313, loss_mask: 0.2380, loss: 0.7177 2024-05-29 18:37:39,378 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 8:46:19, time: 0.713, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0475, loss_cls: 0.1910, acc: 93.0645, loss_bbox: 0.2388, loss_mask: 0.2447, loss: 0.7408 2024-05-29 18:38:15,328 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 8:45:46, time: 0.719, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0448, loss_cls: 0.1920, acc: 93.1018, loss_bbox: 0.2369, loss_mask: 0.2399, loss: 0.7350 2024-05-29 18:38:49,223 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, eta: 8:45:11, time: 0.678, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0444, loss_cls: 0.1901, acc: 
93.0781, loss_bbox: 0.2368, loss_mask: 0.2396, loss: 0.7308 2024-05-29 18:39:26,108 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 8:44:39, time: 0.738, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0455, loss_cls: 0.1864, acc: 93.2302, loss_bbox: 0.2323, loss_mask: 0.2450, loss: 0.7290 2024-05-29 18:39:59,775 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 8:44:03, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0466, loss_cls: 0.1909, acc: 93.1777, loss_bbox: 0.2328, loss_mask: 0.2451, loss: 0.7339 2024-05-29 18:40:33,941 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 8:43:28, time: 0.683, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0452, loss_cls: 0.1917, acc: 92.9668, loss_bbox: 0.2374, loss_mask: 0.2395, loss: 0.7320 2024-05-29 18:41:07,039 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 8:42:52, time: 0.662, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0470, loss_cls: 0.1969, acc: 92.8320, loss_bbox: 0.2390, loss_mask: 0.2415, loss: 0.7451 2024-05-29 18:41:41,136 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 8:42:18, time: 0.682, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0458, loss_cls: 0.1922, acc: 93.0789, loss_bbox: 0.2362, loss_mask: 0.2352, loss: 0.7294 2024-05-29 18:42:14,790 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 8:41:42, time: 0.673, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0456, loss_cls: 0.1915, acc: 93.0076, loss_bbox: 0.2440, loss_mask: 0.2417, loss: 0.7416 2024-05-29 18:42:48,428 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 8:41:07, time: 0.673, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0450, loss_cls: 0.1845, acc: 93.2146, loss_bbox: 0.2332, loss_mask: 0.2436, loss: 0.7249 2024-05-29 18:43:21,672 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 8:40:31, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0449, loss_cls: 0.1883, acc: 93.2097, loss_bbox: 0.2356, loss_mask: 0.2425, loss: 0.7286 2024-05-29 18:43:57,367 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 8:39:58, time: 0.714, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0474, loss_cls: 0.1948, acc: 92.9290, loss_bbox: 0.2426, loss_mask: 0.2435, loss: 0.7511 2024-05-29 18:44:31,288 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 8:39:22, time: 0.678, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0468, loss_cls: 0.1979, acc: 92.7534, loss_bbox: 0.2471, loss_mask: 0.2480, loss: 0.7608 2024-05-29 18:45:07,526 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 8:38:50, time: 0.725, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0442, loss_cls: 0.1930, acc: 93.0183, loss_bbox: 0.2359, loss_mask: 0.2371, loss: 0.7293 2024-05-29 18:45:44,024 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 8:38:17, time: 0.730, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0477, loss_cls: 0.1934, acc: 92.9475, loss_bbox: 0.2448, loss_mask: 0.2397, loss: 0.7459 2024-05-29 18:46:17,902 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 8:37:42, time: 0.678, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0452, loss_cls: 0.1898, acc: 93.2158, loss_bbox: 0.2335, loss_mask: 0.2329, loss: 0.7223 2024-05-29 18:46:54,612 - mmdet - INFO - Exp 
name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 18:46:54,612 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 8:37:10, time: 0.734, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0465, loss_cls: 0.2007, acc: 92.8135, loss_bbox: 0.2460, loss_mask: 0.2390, loss: 0.7526 2024-05-29 18:47:32,120 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 8:36:39, time: 0.750, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0433, loss_cls: 0.1820, acc: 93.5198, loss_bbox: 0.2161, loss_mask: 0.2300, loss: 0.6887 2024-05-29 18:48:07,890 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 8:36:05, time: 0.716, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0460, loss_cls: 0.1871, acc: 93.2593, loss_bbox: 0.2312, loss_mask: 0.2405, loss: 0.7244 2024-05-29 18:48:42,188 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 8:35:31, time: 0.686, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0511, loss_cls: 0.1938, acc: 92.9026, loss_bbox: 0.2428, loss_mask: 0.2434, loss: 0.7534 2024-05-29 18:49:15,677 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 8:34:55, time: 0.670, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0452, loss_cls: 0.1894, acc: 93.0352, loss_bbox: 0.2354, loss_mask: 0.2388, loss: 0.7276 2024-05-29 18:49:51,327 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 8:34:22, time: 0.713, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0470, loss_cls: 0.1955, acc: 92.8035, loss_bbox: 0.2427, loss_mask: 0.2436, loss: 0.7494 2024-05-29 18:50:24,944 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 8:33:46, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0462, loss_cls: 0.1896, acc: 93.2126, loss_bbox: 0.2351, loss_mask: 0.2360, loss: 0.7263 2024-05-29 18:50:58,062 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 8:33:10, time: 0.662, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0449, loss_cls: 0.1874, acc: 93.2002, loss_bbox: 0.2312, loss_mask: 0.2367, loss: 0.7206 2024-05-29 18:51:31,693 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 8:32:35, time: 0.673, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0433, loss_cls: 0.1808, acc: 93.5122, loss_bbox: 0.2262, loss_mask: 0.2366, loss: 0.7059 2024-05-29 18:52:04,768 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 8:31:59, time: 0.661, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0425, loss_cls: 0.1814, acc: 93.4351, loss_bbox: 0.2284, loss_mask: 0.2391, loss: 0.7087 2024-05-29 18:52:38,015 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 8:31:23, time: 0.665, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0460, loss_cls: 0.1892, acc: 93.1826, loss_bbox: 0.2360, loss_mask: 0.2409, loss: 0.7324 2024-05-29 18:53:11,799 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 8:30:48, time: 0.676, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0483, loss_cls: 0.1965, acc: 92.8794, loss_bbox: 0.2440, loss_mask: 0.2419, loss: 0.7524 2024-05-29 18:53:45,400 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 8:30:13, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0457, loss_cls: 0.1862, acc: 93.2122, loss_bbox: 0.2340, loss_mask: 0.2372, loss: 0.7238 2024-05-29 18:54:21,042 - mmdet - INFO - Epoch 
[6][7000/7330] lr: 1.000e-04, eta: 8:29:39, time: 0.713, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0447, loss_cls: 0.1930, acc: 93.0442, loss_bbox: 0.2327, loss_mask: 0.2334, loss: 0.7237
2024-05-29 18:54:54,493 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 8:29:04, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0468, loss_cls: 0.1971, acc: 92.8369, loss_bbox: 0.2407, loss_mask: 0.2421, loss: 0.7451
2024-05-29 18:55:31,140 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 8:28:31, time: 0.733, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0425, loss_cls: 0.1830, acc: 93.3674, loss_bbox: 0.2276, loss_mask: 0.2342, loss: 0.7049
2024-05-29 18:56:07,241 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 8:27:58, time: 0.722, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0465, loss_cls: 0.1927, acc: 93.1484, loss_bbox: 0.2380, loss_mask: 0.2378, loss: 0.7339
2024-05-29 18:56:41,755 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 8:27:24, time: 0.690, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0439, loss_cls: 0.1884, acc: 93.1409, loss_bbox: 0.2300, loss_mask: 0.2368, loss: 0.7184
2024-05-29 18:57:20,082 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 8:26:53, time: 0.766, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0489, loss_cls: 0.2004, acc: 92.7241, loss_bbox: 0.2499, loss_mask: 0.2451, loss: 0.7642
2024-05-29 18:57:55,729 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 8:26:20, time: 0.713, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0434, loss_cls: 0.1889, acc: 93.2402, loss_bbox: 0.2249, loss_mask: 0.2361, loss: 0.7126
2024-05-29 18:58:16,460 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-05-29 19:00:10,591 - mmdet - INFO - Evaluating bbox...
2024-05-29 19:00:36,425 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.427
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.654
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.467
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.260
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.465
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.583
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.554
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.554
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.554
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.369
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.596
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.712
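(Note: the AP/AR block above, and the segm block that follows, are the standard pycocotools summaries; the non-default maxDets values of 100/300/1000 come from the proposal_nums used by MMDetection's COCO evaluation. If the detections for a checkpoint are also dumped to a COCO-format results JSON, an equivalent evaluation can be re-run offline. The sketch below is a minimal, hypothetical example and not part of this run; both file paths are placeholders.)

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: substitute the real val annotation file and a results
# JSON dumped for the checkpoint being inspected.
coco_gt = COCO('data/coco/annotations/instances_val2017.json')
coco_dt = coco_gt.loadRes('results.bbox.json')

coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')  # use 'segm' for the mask metrics
coco_eval.params.maxDets = [100, 300, 1000]             # same maxDets as reported in this log
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints the Average Precision / Average Recall summary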
2024-05-29 19:00:36,425 - mmdet - INFO - Evaluating segm...
2024-05-29 19:01:03,215 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.389
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.626
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.413
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.181
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.424
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.505
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.505
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.505
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.552
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.678
2024-05-29 19:01:03,632 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 19:01:03,633 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4270, bbox_mAP_50: 0.6540, bbox_mAP_75: 0.4670, bbox_mAP_s: 0.2600, bbox_mAP_m: 0.4650, bbox_mAP_l: 0.5830, bbox_mAP_copypaste: 0.427 0.654 0.467 0.260 0.465 0.583, segm_mAP: 0.3890, segm_mAP_50: 0.6260, segm_mAP_75: 0.4130, segm_mAP_s: 0.1810, segm_mAP_m: 0.4240, segm_mAP_l: 0.5910, segm_mAP_copypaste: 0.389 0.626 0.413 0.181 0.424 0.591
2024-05-29 19:01:49,553 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 8:25:15, time: 0.918, data_time: 0.119, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0441, loss_cls: 0.1794, acc: 93.3718, loss_bbox: 0.2270, loss_mask: 0.2372, loss: 0.7046
2024-05-29 19:02:25,642 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 8:24:42, time: 0.722, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0443, loss_cls: 0.1790, acc: 93.4001, loss_bbox: 0.2313, loss_mask: 0.2336, loss: 0.7052
2024-05-29 19:02:59,623 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 8:24:07, time: 0.680, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0465, loss_cls: 0.1907, acc: 92.8572, loss_bbox: 0.2444, loss_mask: 0.2425, loss: 0.7427
2024-05-29 19:03:33,287 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 8:23:32, time: 0.673, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0447, loss_cls: 0.1769, acc: 93.4897, loss_bbox: 0.2273, loss_mask: 0.2314, loss: 0.6979
2024-05-29 19:04:06,883 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 8:22:57, time: 0.672, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0440, loss_cls: 0.1752, acc: 93.4822, loss_bbox: 0.2281, loss_mask: 0.2295, loss: 0.6935
2024-05-29 19:04:40,536 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 8:22:21, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0441, loss_cls: 0.1767, acc: 93.4614, loss_bbox: 0.2249, loss_mask: 0.2309, loss: 0.6940
2024-05-29 19:05:14,406 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 8:21:46, time: 0.677, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0462, loss_cls: 0.1895, acc: 92.9329, loss_bbox: 0.2439, loss_mask: 0.2402, loss: 0.7374
2024-05-29 19:05:48,959 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 8:21:12, time: 0.691, data_time: 0.072, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0473, loss_cls: 0.1899, acc: 93.0383, loss_bbox: 0.2357, loss_mask:
0.2359, loss: 0.7294 2024-05-29 19:06:22,540 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 8:20:36, time: 0.672, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0406, loss_cls: 0.1748, acc: 93.5737, loss_bbox: 0.2232, loss_mask: 0.2279, loss: 0.6830 2024-05-29 19:06:56,203 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 8:20:01, time: 0.673, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0412, loss_cls: 0.1718, acc: 93.6438, loss_bbox: 0.2196, loss_mask: 0.2289, loss: 0.6797 2024-05-29 19:07:30,173 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 8:19:26, time: 0.679, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0457, loss_cls: 0.1827, acc: 93.2988, loss_bbox: 0.2320, loss_mask: 0.2331, loss: 0.7112 2024-05-29 19:08:03,895 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 8:18:51, time: 0.674, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0442, loss_cls: 0.1825, acc: 93.3010, loss_bbox: 0.2280, loss_mask: 0.2258, loss: 0.6978 2024-05-29 19:08:39,602 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 8:18:17, time: 0.714, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0429, loss_cls: 0.1782, acc: 93.4026, loss_bbox: 0.2272, loss_mask: 0.2318, loss: 0.6973 2024-05-29 19:09:13,194 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 8:17:42, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0442, loss_cls: 0.1758, acc: 93.5874, loss_bbox: 0.2296, loss_mask: 0.2305, loss: 0.6975 2024-05-29 19:09:49,255 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 8:17:09, time: 0.721, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0442, loss_cls: 0.1773, acc: 93.5645, loss_bbox: 0.2291, loss_mask: 0.2286, loss: 0.6965 2024-05-29 19:10:26,499 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 8:16:37, time: 0.745, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0428, loss_cls: 0.1794, acc: 93.4380, loss_bbox: 0.2278, loss_mask: 0.2289, loss: 0.6966 2024-05-29 19:11:00,041 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 8:16:02, time: 0.671, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0461, loss_cls: 0.1821, acc: 93.3303, loss_bbox: 0.2284, loss_mask: 0.2322, loss: 0.7075 2024-05-29 19:11:33,121 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 8:15:26, time: 0.662, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0451, loss_cls: 0.1807, acc: 93.4260, loss_bbox: 0.2252, loss_mask: 0.2307, loss: 0.7005 2024-05-29 19:12:09,095 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 8:14:53, time: 0.719, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0442, loss_cls: 0.1752, acc: 93.5950, loss_bbox: 0.2246, loss_mask: 0.2350, loss: 0.6969 2024-05-29 19:12:47,328 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 8:14:22, time: 0.765, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0430, loss_cls: 0.1728, acc: 93.6633, loss_bbox: 0.2214, loss_mask: 0.2271, loss: 0.6811 2024-05-29 19:13:21,067 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 8:13:47, time: 0.675, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0467, loss_cls: 0.1896, acc: 93.0530, loss_bbox: 0.2348, loss_mask: 0.2344, loss: 0.7248 2024-05-29 19:13:56,753 - mmdet - INFO - Epoch [7][1100/7330] lr: 1.000e-04, eta: 8:13:13, time: 
0.714, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0444, loss_cls: 0.1798, acc: 93.4370, loss_bbox: 0.2266, loss_mask: 0.2319, loss: 0.6995 2024-05-29 19:14:30,063 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 8:12:37, time: 0.666, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0448, loss_cls: 0.1904, acc: 92.9492, loss_bbox: 0.2379, loss_mask: 0.2381, loss: 0.7296 2024-05-29 19:15:06,284 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 8:12:05, time: 0.724, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0453, loss_cls: 0.1816, acc: 93.2966, loss_bbox: 0.2327, loss_mask: 0.2319, loss: 0.7106 2024-05-29 19:15:39,283 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 8:11:29, time: 0.660, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0470, loss_cls: 0.1886, acc: 93.0354, loss_bbox: 0.2355, loss_mask: 0.2388, loss: 0.7279 2024-05-29 19:16:11,856 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 8:10:52, time: 0.651, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0419, loss_cls: 0.1675, acc: 93.9409, loss_bbox: 0.2163, loss_mask: 0.2319, loss: 0.6736 2024-05-29 19:16:45,454 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 8:10:17, time: 0.672, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0443, loss_cls: 0.1808, acc: 93.3206, loss_bbox: 0.2315, loss_mask: 0.2385, loss: 0.7135 2024-05-29 19:17:19,874 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 8:09:42, time: 0.688, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0499, loss_cls: 0.1901, acc: 92.9822, loss_bbox: 0.2419, loss_mask: 0.2386, loss: 0.7423 2024-05-29 19:17:53,309 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 8:09:07, time: 0.669, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0427, loss_cls: 0.1877, acc: 93.1538, loss_bbox: 0.2341, loss_mask: 0.2362, loss: 0.7178 2024-05-29 19:18:26,578 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 8:08:31, time: 0.665, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0445, loss_cls: 0.1807, acc: 93.3667, loss_bbox: 0.2252, loss_mask: 0.2296, loss: 0.6980 2024-05-29 19:19:02,883 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 8:07:58, time: 0.726, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0444, loss_cls: 0.1869, acc: 93.2488, loss_bbox: 0.2333, loss_mask: 0.2373, loss: 0.7198 2024-05-29 19:19:36,462 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 8:07:23, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0460, loss_cls: 0.1872, acc: 93.0762, loss_bbox: 0.2381, loss_mask: 0.2390, loss: 0.7296 2024-05-29 19:20:16,366 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 8:06:54, time: 0.798, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0449, loss_cls: 0.1816, acc: 93.3850, loss_bbox: 0.2273, loss_mask: 0.2289, loss: 0.6993 2024-05-29 19:20:49,948 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 8:06:18, time: 0.672, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0439, loss_cls: 0.1829, acc: 93.3433, loss_bbox: 0.2299, loss_mask: 0.2320, loss: 0.7072 2024-05-29 19:21:23,788 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 8:05:43, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0426, loss_cls: 0.1778, acc: 
93.4580, loss_bbox: 0.2258, loss_mask: 0.2344, loss: 0.6983 2024-05-29 19:22:00,659 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 8:05:11, time: 0.737, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0472, loss_cls: 0.1874, acc: 93.1152, loss_bbox: 0.2373, loss_mask: 0.2391, loss: 0.7294 2024-05-29 19:22:34,088 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 8:04:35, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0397, loss_cls: 0.1640, acc: 93.9490, loss_bbox: 0.2132, loss_mask: 0.2300, loss: 0.6626 2024-05-29 19:23:12,551 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 8:04:04, time: 0.769, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0444, loss_cls: 0.1841, acc: 93.3284, loss_bbox: 0.2290, loss_mask: 0.2323, loss: 0.7083 2024-05-29 19:23:46,029 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 8:03:29, time: 0.669, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0431, loss_cls: 0.1824, acc: 93.2559, loss_bbox: 0.2345, loss_mask: 0.2357, loss: 0.7136 2024-05-29 19:24:22,168 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 8:02:56, time: 0.723, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0462, loss_cls: 0.1826, acc: 93.3428, loss_bbox: 0.2326, loss_mask: 0.2306, loss: 0.7100 2024-05-29 19:24:55,995 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 8:02:21, time: 0.676, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0425, loss_cls: 0.1802, acc: 93.3901, loss_bbox: 0.2265, loss_mask: 0.2278, loss: 0.6948 2024-05-29 19:25:32,485 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 8:01:48, time: 0.730, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0410, loss_cls: 0.1744, acc: 93.5962, loss_bbox: 0.2252, loss_mask: 0.2375, loss: 0.6937 2024-05-29 19:26:05,733 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 8:01:12, time: 0.665, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0431, loss_cls: 0.1804, acc: 93.3108, loss_bbox: 0.2260, loss_mask: 0.2342, loss: 0.7004 2024-05-29 19:26:38,902 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 8:00:37, time: 0.663, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0422, loss_cls: 0.1776, acc: 93.4321, loss_bbox: 0.2293, loss_mask: 0.2359, loss: 0.7021 2024-05-29 19:27:11,807 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 8:00:01, time: 0.658, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0423, loss_cls: 0.1715, acc: 93.7490, loss_bbox: 0.2211, loss_mask: 0.2316, loss: 0.6830 2024-05-29 19:27:45,441 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 7:59:25, time: 0.673, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0444, loss_cls: 0.1791, acc: 93.3645, loss_bbox: 0.2247, loss_mask: 0.2329, loss: 0.7003 2024-05-29 19:28:18,775 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 7:58:50, time: 0.667, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0445, loss_cls: 0.1795, acc: 93.4558, loss_bbox: 0.2291, loss_mask: 0.2349, loss: 0.7068 2024-05-29 19:28:54,983 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 7:58:17, time: 0.724, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0437, loss_cls: 0.1728, acc: 93.5308, loss_bbox: 0.2233, loss_mask: 0.2304, loss: 0.6879 2024-05-29 19:29:28,897 - mmdet - INFO - Epoch 
[7][2450/7330] lr: 1.000e-04, eta: 7:57:42, time: 0.678, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0448, loss_cls: 0.1811, acc: 93.3359, loss_bbox: 0.2301, loss_mask: 0.2320, loss: 0.7050 2024-05-29 19:30:02,253 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 7:57:06, time: 0.667, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0482, loss_cls: 0.1941, acc: 92.9353, loss_bbox: 0.2408, loss_mask: 0.2417, loss: 0.7453 2024-05-29 19:30:40,184 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 7:56:35, time: 0.759, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0455, loss_cls: 0.1753, acc: 93.6812, loss_bbox: 0.2215, loss_mask: 0.2279, loss: 0.6865 2024-05-29 19:31:14,361 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 7:56:00, time: 0.683, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0439, loss_cls: 0.1890, acc: 93.2434, loss_bbox: 0.2310, loss_mask: 0.2370, loss: 0.7208 2024-05-29 19:31:48,281 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 7:55:25, time: 0.678, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0457, loss_cls: 0.1843, acc: 93.2637, loss_bbox: 0.2293, loss_mask: 0.2384, loss: 0.7172 2024-05-29 19:32:23,883 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 7:54:51, time: 0.712, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0454, loss_cls: 0.1810, acc: 93.3394, loss_bbox: 0.2282, loss_mask: 0.2333, loss: 0.7058 2024-05-29 19:32:59,292 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 7:54:18, time: 0.709, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0450, loss_cls: 0.1813, acc: 93.1790, loss_bbox: 0.2345, loss_mask: 0.2373, loss: 0.7171 2024-05-29 19:33:34,616 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 7:53:44, time: 0.707, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0437, loss_cls: 0.1841, acc: 93.3179, loss_bbox: 0.2305, loss_mask: 0.2346, loss: 0.7114 2024-05-29 19:34:07,792 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 7:53:08, time: 0.663, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0438, loss_cls: 0.1826, acc: 93.2539, loss_bbox: 0.2319, loss_mask: 0.2352, loss: 0.7119 2024-05-29 19:34:43,191 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 7:52:34, time: 0.708, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0454, loss_cls: 0.1829, acc: 93.1819, loss_bbox: 0.2336, loss_mask: 0.2368, loss: 0.7153 2024-05-29 19:35:17,088 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 7:51:59, time: 0.678, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0424, loss_cls: 0.1834, acc: 93.2837, loss_bbox: 0.2267, loss_mask: 0.2270, loss: 0.6964 2024-05-29 19:35:52,741 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 7:51:26, time: 0.713, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0451, loss_cls: 0.1820, acc: 93.3564, loss_bbox: 0.2271, loss_mask: 0.2343, loss: 0.7079 2024-05-29 19:36:26,332 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 7:50:51, time: 0.672, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0443, loss_cls: 0.1835, acc: 93.2397, loss_bbox: 0.2353, loss_mask: 0.2371, loss: 0.7201 2024-05-29 19:37:00,288 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 7:50:16, time: 0.679, data_time: 0.062, memory: 11628, loss_rpn_cls: 
0.0188, loss_rpn_bbox: 0.0474, loss_cls: 0.1921, acc: 92.9583, loss_bbox: 0.2415, loss_mask: 0.2429, loss: 0.7428 2024-05-29 19:37:33,585 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 7:49:40, time: 0.666, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0438, loss_cls: 0.1765, acc: 93.5574, loss_bbox: 0.2240, loss_mask: 0.2311, loss: 0.6931 2024-05-29 19:38:07,692 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 7:49:05, time: 0.682, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0446, loss_cls: 0.1869, acc: 93.0857, loss_bbox: 0.2369, loss_mask: 0.2371, loss: 0.7240 2024-05-29 19:38:41,820 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 7:48:30, time: 0.683, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0450, loss_cls: 0.1830, acc: 93.3167, loss_bbox: 0.2302, loss_mask: 0.2354, loss: 0.7123 2024-05-29 19:39:18,387 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 7:47:58, time: 0.731, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0421, loss_cls: 0.1763, acc: 93.5417, loss_bbox: 0.2205, loss_mask: 0.2319, loss: 0.6869 2024-05-29 19:39:52,522 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 7:47:23, time: 0.682, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0455, loss_cls: 0.1883, acc: 93.1899, loss_bbox: 0.2308, loss_mask: 0.2369, loss: 0.7195 2024-05-29 19:40:28,090 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 7:46:49, time: 0.712, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0417, loss_cls: 0.1759, acc: 93.5869, loss_bbox: 0.2215, loss_mask: 0.2305, loss: 0.6875 2024-05-29 19:41:04,171 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 7:46:16, time: 0.722, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0450, loss_cls: 0.1857, acc: 93.1787, loss_bbox: 0.2302, loss_mask: 0.2338, loss: 0.7140 2024-05-29 19:41:37,800 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 7:45:41, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0441, loss_cls: 0.1823, acc: 93.3762, loss_bbox: 0.2346, loss_mask: 0.2353, loss: 0.7153 2024-05-29 19:42:11,054 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 7:45:05, time: 0.665, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0420, loss_cls: 0.1773, acc: 93.4688, loss_bbox: 0.2218, loss_mask: 0.2321, loss: 0.6900 2024-05-29 19:42:47,340 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 7:44:32, time: 0.726, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0473, loss_cls: 0.1883, acc: 93.1152, loss_bbox: 0.2360, loss_mask: 0.2401, loss: 0.7298 2024-05-29 19:43:23,582 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 7:43:59, time: 0.725, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0442, loss_cls: 0.1846, acc: 93.2385, loss_bbox: 0.2351, loss_mask: 0.2353, loss: 0.7200 2024-05-29 19:44:00,483 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 7:43:27, time: 0.738, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0476, loss_cls: 0.1929, acc: 92.9570, loss_bbox: 0.2382, loss_mask: 0.2454, loss: 0.7441 2024-05-29 19:44:36,675 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 7:42:54, time: 0.724, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0452, loss_cls: 0.1852, acc: 93.1787, loss_bbox: 0.2347, loss_mask: 0.2363, loss: 
0.7213 2024-05-29 19:45:10,323 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 7:42:18, time: 0.673, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0446, loss_cls: 0.1857, acc: 93.3528, loss_bbox: 0.2288, loss_mask: 0.2367, loss: 0.7132 2024-05-29 19:45:46,751 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 7:41:45, time: 0.728, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0476, loss_cls: 0.1940, acc: 92.9448, loss_bbox: 0.2357, loss_mask: 0.2365, loss: 0.7362 2024-05-29 19:46:20,541 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 7:41:10, time: 0.676, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0455, loss_cls: 0.1872, acc: 93.1953, loss_bbox: 0.2330, loss_mask: 0.2362, loss: 0.7215 2024-05-29 19:46:53,832 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 7:40:35, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0421, loss_cls: 0.1785, acc: 93.4558, loss_bbox: 0.2209, loss_mask: 0.2344, loss: 0.6942 2024-05-29 19:47:27,920 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 7:40:00, time: 0.682, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0440, loss_cls: 0.1887, acc: 93.1609, loss_bbox: 0.2398, loss_mask: 0.2398, loss: 0.7298 2024-05-29 19:48:01,264 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 7:39:24, time: 0.667, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0465, loss_cls: 0.1807, acc: 93.2979, loss_bbox: 0.2348, loss_mask: 0.2391, loss: 0.7190 2024-05-29 19:48:34,393 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 7:38:49, time: 0.662, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0464, loss_cls: 0.1856, acc: 93.2825, loss_bbox: 0.2297, loss_mask: 0.2351, loss: 0.7163 2024-05-29 19:49:07,904 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 7:38:13, time: 0.670, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0412, loss_cls: 0.1804, acc: 93.5503, loss_bbox: 0.2194, loss_mask: 0.2229, loss: 0.6827 2024-05-29 19:49:43,285 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 7:37:39, time: 0.707, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0422, loss_cls: 0.1747, acc: 93.6162, loss_bbox: 0.2186, loss_mask: 0.2301, loss: 0.6832 2024-05-29 19:50:16,739 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 7:37:04, time: 0.669, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0432, loss_cls: 0.1760, acc: 93.5295, loss_bbox: 0.2249, loss_mask: 0.2267, loss: 0.6883 2024-05-29 19:50:52,354 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 7:36:30, time: 0.712, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0464, loss_cls: 0.1858, acc: 93.1929, loss_bbox: 0.2351, loss_mask: 0.2380, loss: 0.7246 2024-05-29 19:51:28,299 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 7:35:57, time: 0.719, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0437, loss_cls: 0.1904, acc: 92.9502, loss_bbox: 0.2373, loss_mask: 0.2377, loss: 0.7275 2024-05-29 19:52:02,173 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 7:35:22, time: 0.678, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0460, loss_cls: 0.1916, acc: 93.1196, loss_bbox: 0.2376, loss_mask: 0.2380, loss: 0.7327 2024-05-29 19:52:35,585 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 7:34:47, time: 
0.668, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0435, loss_cls: 0.1777, acc: 93.4460, loss_bbox: 0.2259, loss_mask: 0.2342, loss: 0.6974 2024-05-29 19:53:11,493 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 7:34:13, time: 0.718, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0440, loss_cls: 0.1795, acc: 93.3796, loss_bbox: 0.2249, loss_mask: 0.2337, loss: 0.7014 2024-05-29 19:53:49,554 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 7:33:42, time: 0.761, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0464, loss_cls: 0.1871, acc: 93.0745, loss_bbox: 0.2397, loss_mask: 0.2388, loss: 0.7311 2024-05-29 19:54:23,285 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 7:33:06, time: 0.675, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0467, loss_cls: 0.1907, acc: 93.0349, loss_bbox: 0.2416, loss_mask: 0.2367, loss: 0.7342 2024-05-29 19:54:59,806 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 7:32:34, time: 0.730, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0464, loss_cls: 0.1871, acc: 93.1594, loss_bbox: 0.2308, loss_mask: 0.2397, loss: 0.7231 2024-05-29 19:55:33,795 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 7:31:59, time: 0.680, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0459, loss_cls: 0.1862, acc: 93.2607, loss_bbox: 0.2301, loss_mask: 0.2401, loss: 0.7205 2024-05-29 19:56:10,026 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 7:31:26, time: 0.725, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0475, loss_cls: 0.1934, acc: 92.9761, loss_bbox: 0.2430, loss_mask: 0.2418, loss: 0.7453 2024-05-29 19:56:43,802 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 7:30:50, time: 0.676, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0414, loss_cls: 0.1719, acc: 93.6953, loss_bbox: 0.2151, loss_mask: 0.2256, loss: 0.6705 2024-05-29 19:57:17,077 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 7:30:15, time: 0.665, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0440, loss_cls: 0.1889, acc: 93.0779, loss_bbox: 0.2365, loss_mask: 0.2340, loss: 0.7221 2024-05-29 19:57:50,169 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 7:29:39, time: 0.662, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0433, loss_cls: 0.1886, acc: 93.2249, loss_bbox: 0.2314, loss_mask: 0.2373, loss: 0.7196 2024-05-29 19:58:23,019 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 7:29:03, time: 0.657, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0441, loss_cls: 0.1802, acc: 93.4041, loss_bbox: 0.2261, loss_mask: 0.2352, loss: 0.7039 2024-05-29 19:58:56,542 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 7:28:28, time: 0.670, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0471, loss_cls: 0.1867, acc: 93.1531, loss_bbox: 0.2328, loss_mask: 0.2369, loss: 0.7217 2024-05-29 19:59:33,555 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 7:27:56, time: 0.740, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0458, loss_cls: 0.1845, acc: 93.2095, loss_bbox: 0.2296, loss_mask: 0.2289, loss: 0.7074 2024-05-29 20:00:07,373 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 7:27:20, time: 0.676, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0461, loss_cls: 0.1880, acc: 
93.1643, loss_bbox: 0.2365, loss_mask: 0.2351, loss: 0.7250 2024-05-29 20:00:41,044 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 7:26:45, time: 0.673, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0437, loss_cls: 0.1881, acc: 93.1602, loss_bbox: 0.2304, loss_mask: 0.2366, loss: 0.7168 2024-05-29 20:01:19,261 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 7:26:14, time: 0.764, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0435, loss_cls: 0.1871, acc: 93.1907, loss_bbox: 0.2345, loss_mask: 0.2349, loss: 0.7189 2024-05-29 20:01:52,837 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 7:25:38, time: 0.672, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0435, loss_cls: 0.1770, acc: 93.5493, loss_bbox: 0.2241, loss_mask: 0.2356, loss: 0.6969 2024-05-29 20:02:26,215 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 7:25:03, time: 0.668, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0451, loss_cls: 0.1777, acc: 93.4788, loss_bbox: 0.2245, loss_mask: 0.2327, loss: 0.6981 2024-05-29 20:02:59,735 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 7:24:28, time: 0.670, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0479, loss_cls: 0.1826, acc: 93.2063, loss_bbox: 0.2364, loss_mask: 0.2371, loss: 0.7245 2024-05-29 20:03:35,761 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 7:23:54, time: 0.721, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0450, loss_cls: 0.1909, acc: 93.0754, loss_bbox: 0.2321, loss_mask: 0.2302, loss: 0.7174 2024-05-29 20:04:14,211 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 7:23:23, time: 0.769, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0437, loss_cls: 0.1822, acc: 93.3289, loss_bbox: 0.2277, loss_mask: 0.2320, loss: 0.7052 2024-05-29 20:04:47,508 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 7:22:47, time: 0.666, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0428, loss_cls: 0.1839, acc: 93.2639, loss_bbox: 0.2275, loss_mask: 0.2355, loss: 0.7072 2024-05-29 20:05:22,958 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 7:22:14, time: 0.709, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0435, loss_cls: 0.1905, acc: 93.1963, loss_bbox: 0.2324, loss_mask: 0.2341, loss: 0.7184 2024-05-29 20:05:56,294 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 7:21:38, time: 0.667, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0436, loss_cls: 0.1853, acc: 93.2180, loss_bbox: 0.2330, loss_mask: 0.2372, loss: 0.7174 2024-05-29 20:06:32,233 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 7:21:05, time: 0.719, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0442, loss_cls: 0.1758, acc: 93.5361, loss_bbox: 0.2220, loss_mask: 0.2336, loss: 0.6954 2024-05-29 20:07:05,720 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 7:20:29, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0454, loss_cls: 0.1870, acc: 93.1187, loss_bbox: 0.2333, loss_mask: 0.2377, loss: 0.7232 2024-05-29 20:07:38,997 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 7:19:54, time: 0.666, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0423, loss_cls: 0.1869, acc: 93.1201, loss_bbox: 0.2336, loss_mask: 0.2374, loss: 0.7192 2024-05-29 20:08:12,254 - mmdet - INFO - Epoch 
[7][5800/7330] lr: 1.000e-04, eta: 7:19:18, time: 0.665, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0441, loss_cls: 0.1841, acc: 93.3037, loss_bbox: 0.2282, loss_mask: 0.2280, loss: 0.7037 2024-05-29 20:08:46,668 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 7:18:44, time: 0.688, data_time: 0.068, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0457, loss_cls: 0.1835, acc: 93.3301, loss_bbox: 0.2298, loss_mask: 0.2298, loss: 0.7073 2024-05-29 20:09:20,357 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 7:18:09, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0420, loss_cls: 0.1809, acc: 93.3911, loss_bbox: 0.2283, loss_mask: 0.2324, loss: 0.7015 2024-05-29 20:09:57,767 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 7:17:36, time: 0.748, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0441, loss_cls: 0.1848, acc: 93.2573, loss_bbox: 0.2290, loss_mask: 0.2327, loss: 0.7072 2024-05-29 20:10:31,085 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 7:17:01, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0463, loss_cls: 0.1874, acc: 93.2209, loss_bbox: 0.2303, loss_mask: 0.2334, loss: 0.7176 2024-05-29 20:11:04,540 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 7:16:26, time: 0.669, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0443, loss_cls: 0.1857, acc: 93.1558, loss_bbox: 0.2360, loss_mask: 0.2431, loss: 0.7283 2024-05-29 20:11:42,620 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 7:15:54, time: 0.762, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0454, loss_cls: 0.1823, acc: 93.3538, loss_bbox: 0.2267, loss_mask: 0.2342, loss: 0.7070 2024-05-29 20:12:16,279 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 7:15:19, time: 0.673, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0445, loss_cls: 0.1861, acc: 93.2922, loss_bbox: 0.2250, loss_mask: 0.2346, loss: 0.7091 2024-05-29 20:12:49,377 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 7:14:43, time: 0.662, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0437, loss_cls: 0.1853, acc: 93.2932, loss_bbox: 0.2259, loss_mask: 0.2337, loss: 0.7056 2024-05-29 20:13:26,108 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 7:14:10, time: 0.735, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0456, loss_cls: 0.1826, acc: 93.2957, loss_bbox: 0.2310, loss_mask: 0.2354, loss: 0.7143 2024-05-29 20:13:59,926 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 7:13:35, time: 0.676, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0447, loss_cls: 0.1853, acc: 93.2092, loss_bbox: 0.2314, loss_mask: 0.2346, loss: 0.7143 2024-05-29 20:14:38,286 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 7:13:03, time: 0.767, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0459, loss_cls: 0.1875, acc: 93.1101, loss_bbox: 0.2308, loss_mask: 0.2390, loss: 0.7219 2024-05-29 20:15:11,748 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 7:12:28, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0439, loss_cls: 0.1873, acc: 93.1064, loss_bbox: 0.2353, loss_mask: 0.2384, loss: 0.7239 2024-05-29 20:15:47,350 - mmdet - INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 7:11:54, time: 0.712, data_time: 0.047, memory: 11628, loss_rpn_cls: 
0.0181, loss_rpn_bbox: 0.0427, loss_cls: 0.1836, acc: 93.3906, loss_bbox: 0.2284, loss_mask: 0.2346, loss: 0.7075 2024-05-29 20:16:20,767 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 7:11:19, time: 0.668, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0456, loss_cls: 0.1931, acc: 92.8596, loss_bbox: 0.2398, loss_mask: 0.2383, loss: 0.7359 2024-05-29 20:16:56,378 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 7:10:45, time: 0.712, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0450, loss_cls: 0.1851, acc: 93.3130, loss_bbox: 0.2282, loss_mask: 0.2346, loss: 0.7110 2024-05-29 20:17:30,031 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 7:10:10, time: 0.673, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0446, loss_cls: 0.1824, acc: 93.2993, loss_bbox: 0.2296, loss_mask: 0.2389, loss: 0.7137 2024-05-29 20:18:03,799 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 7:09:35, time: 0.675, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0470, loss_cls: 0.1878, acc: 93.1294, loss_bbox: 0.2358, loss_mask: 0.2425, loss: 0.7322 2024-05-29 20:18:36,890 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 7:08:59, time: 0.662, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0441, loss_cls: 0.1874, acc: 93.1960, loss_bbox: 0.2338, loss_mask: 0.2360, loss: 0.7191 2024-05-29 20:19:10,300 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 7:08:24, time: 0.669, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0424, loss_cls: 0.1785, acc: 93.4863, loss_bbox: 0.2241, loss_mask: 0.2294, loss: 0.6918 2024-05-29 20:19:44,049 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 7:07:49, time: 0.675, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0462, loss_cls: 0.1851, acc: 93.2449, loss_bbox: 0.2320, loss_mask: 0.2343, loss: 0.7168 2024-05-29 20:20:19,812 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 7:07:15, time: 0.715, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0461, loss_cls: 0.1883, acc: 93.2185, loss_bbox: 0.2301, loss_mask: 0.2344, loss: 0.7178 2024-05-29 20:20:53,396 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 7:06:40, time: 0.672, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0463, loss_cls: 0.1874, acc: 93.1340, loss_bbox: 0.2372, loss_mask: 0.2352, loss: 0.7260 2024-05-29 20:21:26,885 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 7:06:05, time: 0.670, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0418, loss_cls: 0.1797, acc: 93.4500, loss_bbox: 0.2207, loss_mask: 0.2289, loss: 0.6882 2024-05-29 20:22:05,068 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 7:05:33, time: 0.764, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0435, loss_cls: 0.1818, acc: 93.3818, loss_bbox: 0.2273, loss_mask: 0.2302, loss: 0.6991 2024-05-29 20:22:39,351 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 7:04:58, time: 0.686, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0469, loss_cls: 0.1892, acc: 93.0134, loss_bbox: 0.2409, loss_mask: 0.2379, loss: 0.7353 2024-05-29 20:23:12,807 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 7:04:23, time: 0.669, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0464, loss_cls: 0.1918, acc: 92.9688, loss_bbox: 0.2398, loss_mask: 0.2408, loss: 
0.7392
2024-05-29 20:23:49,916 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 7:03:50, time: 0.743, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0440, loss_cls: 0.1825, acc: 93.2778, loss_bbox: 0.2287, loss_mask: 0.2337, loss: 0.7058
2024-05-29 20:24:26,684 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 7:03:18, time: 0.735, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0444, loss_cls: 0.1940, acc: 92.9988, loss_bbox: 0.2393, loss_mask: 0.2390, loss: 0.7367
2024-05-29 20:25:03,172 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 7:02:44, time: 0.730, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0494, loss_cls: 0.1883, acc: 93.0894, loss_bbox: 0.2371, loss_mask: 0.2411, loss: 0.7346
2024-05-29 20:25:39,101 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 7:02:11, time: 0.719, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0432, loss_cls: 0.1822, acc: 93.3752, loss_bbox: 0.2322, loss_mask: 0.2329, loss: 0.7085
2024-05-29 20:25:59,523 - mmdet - INFO - Saving checkpoint at 7 epochs
2024-05-29 20:27:58,812 - mmdet - INFO - Evaluating bbox...
2024-05-29 20:28:22,754 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.431
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.658
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.470
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.251
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.474
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.587
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.365
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.603
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.717
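(Note: with one evaluation block like this written per epoch, the single-line Epoch(val) summaries that follow each block, e.g. bbox_mAP: 0.4270 and segm_mAP: 0.3890 after epoch 6 above, are the most convenient values to track across the run. A small parser for logs in this format could look like the hypothetical sketch below; the log filename is a placeholder.)

import re

# Matches lines like: "... Epoch(val) [6][625] bbox_mAP: 0.4270, ... segm_mAP: 0.3890, ..."
pattern = re.compile(r"Epoch\(val\) \[(\d+)\]\[\d+\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)")

results = {}
with open('train.log') as f:  # placeholder filename
    for line in f:
        m = pattern.search(line)
        if m:
            results[int(m.group(1))] = (float(m.group(2)), float(m.group(3)))

# For the epochs logged so far this would give {6: (0.427, 0.389), 7: (0.431, 0.392)}.
for epoch, (bbox_map, segm_map) in sorted(results.items()):
    print(f"epoch {epoch}: bbox_mAP={bbox_map:.3f}  segm_mAP={segm_map:.3f}")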
2024-05-29 20:28:51,945 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.392 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.625 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.415 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.179 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.424 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.597 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.508 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.508 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.508 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.552 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.690 2024-05-29 20:28:52,433 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-29 20:28:52,435 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4310, bbox_mAP_50: 0.6580, bbox_mAP_75: 0.4700, bbox_mAP_s: 0.2510, bbox_mAP_m: 0.4740, bbox_mAP_l: 0.5870, bbox_mAP_copypaste: 0.431 0.658 0.470 0.251 0.474 0.587, segm_mAP: 0.3920, segm_mAP_50: 0.6250, segm_mAP_75: 0.4150, segm_mAP_s: 0.1790, segm_mAP_m: 0.4240, segm_mAP_l: 0.5970, segm_mAP_copypaste: 0.392 0.625 0.415 0.179 0.424 0.597 2024-05-29 20:29:32,458 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 7:01:05, time: 0.800, data_time: 0.118, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0432, loss_cls: 0.1737, acc: 93.4900, loss_bbox: 0.2302, loss_mask: 0.2310, loss: 0.6943 2024-05-29 20:30:06,526 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 7:00:30, time: 0.681, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0491, loss_cls: 0.1847, acc: 93.1602, loss_bbox: 0.2403, loss_mask: 0.2366, loss: 0.7297 2024-05-29 20:30:40,190 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 6:59:55, time: 0.673, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0433, loss_cls: 0.1720, acc: 93.5945, loss_bbox: 0.2253, loss_mask: 0.2307, loss: 0.6887 2024-05-29 20:31:13,901 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 6:59:20, time: 0.674, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0447, loss_cls: 0.1761, acc: 93.5588, loss_bbox: 0.2230, loss_mask: 0.2317, loss: 0.6924 2024-05-29 20:31:47,655 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 6:58:45, time: 0.675, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0464, loss_cls: 0.1805, acc: 93.2742, loss_bbox: 0.2299, loss_mask: 0.2374, loss: 0.7117 2024-05-29 20:32:21,210 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 6:58:10, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0443, loss_cls: 0.1769, acc: 93.3689, loss_bbox: 0.2280, loss_mask: 0.2295, loss: 0.6960 2024-05-29 20:32:56,871 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 6:57:36, time: 0.713, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0442, loss_cls: 0.1769, acc: 93.5173, loss_bbox: 0.2234, loss_mask: 0.2236, loss: 0.6843 2024-05-29 20:33:30,303 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 6:57:01, time: 0.669, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0439, loss_cls: 0.1814, acc: 93.3403, loss_bbox: 0.2285, loss_mask: 
0.2376, loss: 0.7088 2024-05-29 20:34:04,003 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 6:56:25, time: 0.674, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0457, loss_cls: 0.1755, acc: 93.5132, loss_bbox: 0.2290, loss_mask: 0.2299, loss: 0.6981 2024-05-29 20:34:37,760 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 6:55:50, time: 0.675, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0434, loss_cls: 0.1800, acc: 93.3455, loss_bbox: 0.2268, loss_mask: 0.2291, loss: 0.6955 2024-05-29 20:35:11,378 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 6:55:15, time: 0.672, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0434, loss_cls: 0.1790, acc: 93.5178, loss_bbox: 0.2304, loss_mask: 0.2278, loss: 0.6984 2024-05-29 20:35:44,496 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 6:54:40, time: 0.662, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0408, loss_cls: 0.1702, acc: 93.7217, loss_bbox: 0.2189, loss_mask: 0.2247, loss: 0.6717 2024-05-29 20:36:21,029 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 6:54:07, time: 0.731, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0433, loss_cls: 0.1811, acc: 93.2903, loss_bbox: 0.2301, loss_mask: 0.2319, loss: 0.7050 2024-05-29 20:36:57,592 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 6:53:34, time: 0.731, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0421, loss_cls: 0.1671, acc: 93.8542, loss_bbox: 0.2129, loss_mask: 0.2245, loss: 0.6631 2024-05-29 20:37:30,982 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 6:52:58, time: 0.668, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0442, loss_cls: 0.1725, acc: 93.5635, loss_bbox: 0.2248, loss_mask: 0.2291, loss: 0.6879 2024-05-29 20:38:07,363 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 6:52:25, time: 0.728, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0429, loss_cls: 0.1721, acc: 93.6082, loss_bbox: 0.2189, loss_mask: 0.2286, loss: 0.6776 2024-05-29 20:38:41,020 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 6:51:50, time: 0.673, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0462, loss_cls: 0.1779, acc: 93.4885, loss_bbox: 0.2269, loss_mask: 0.2322, loss: 0.7001 2024-05-29 20:39:14,927 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 6:51:15, time: 0.678, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0439, loss_cls: 0.1803, acc: 93.3923, loss_bbox: 0.2273, loss_mask: 0.2338, loss: 0.7018 2024-05-29 20:39:53,176 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 6:50:43, time: 0.765, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0442, loss_cls: 0.1831, acc: 93.2195, loss_bbox: 0.2316, loss_mask: 0.2277, loss: 0.7042 2024-05-29 20:40:29,000 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 6:50:10, time: 0.716, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1799, acc: 93.3008, loss_bbox: 0.2337, loss_mask: 0.2333, loss: 0.7072 2024-05-29 20:41:02,172 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 6:49:34, time: 0.664, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0410, loss_cls: 0.1715, acc: 93.6980, loss_bbox: 0.2200, loss_mask: 0.2224, loss: 0.6713 2024-05-29 20:41:35,646 - mmdet - INFO - Epoch [8][1100/7330] lr: 1.000e-04, eta: 6:48:59, time: 
0.669, data_time: 0.034, memory: 11628, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0399, loss_cls: 0.1639, acc: 93.9231, loss_bbox: 0.2093, loss_mask: 0.2173, loss: 0.6455 2024-05-29 20:42:08,982 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 6:48:24, time: 0.667, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0441, loss_cls: 0.1763, acc: 93.3899, loss_bbox: 0.2251, loss_mask: 0.2328, loss: 0.6951 2024-05-29 20:42:44,748 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 6:47:50, time: 0.715, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0434, loss_cls: 0.1780, acc: 93.3577, loss_bbox: 0.2332, loss_mask: 0.2313, loss: 0.7047 2024-05-29 20:43:20,425 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 6:47:16, time: 0.713, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0437, loss_cls: 0.1707, acc: 93.6594, loss_bbox: 0.2195, loss_mask: 0.2297, loss: 0.6818 2024-05-29 20:43:53,462 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 6:46:41, time: 0.661, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0424, loss_cls: 0.1710, acc: 93.6885, loss_bbox: 0.2146, loss_mask: 0.2262, loss: 0.6708 2024-05-29 20:44:27,076 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 6:46:06, time: 0.672, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0430, loss_cls: 0.1772, acc: 93.3999, loss_bbox: 0.2320, loss_mask: 0.2315, loss: 0.7005 2024-05-29 20:45:00,695 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 6:45:30, time: 0.672, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0420, loss_cls: 0.1760, acc: 93.4500, loss_bbox: 0.2242, loss_mask: 0.2277, loss: 0.6863 2024-05-29 20:45:34,719 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 6:44:56, time: 0.681, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0455, loss_cls: 0.1864, acc: 93.0830, loss_bbox: 0.2389, loss_mask: 0.2403, loss: 0.7297 2024-05-29 20:46:08,166 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 6:44:20, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0429, loss_cls: 0.1778, acc: 93.4536, loss_bbox: 0.2281, loss_mask: 0.2332, loss: 0.6996 2024-05-29 20:46:43,563 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 6:43:46, time: 0.708, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0444, loss_cls: 0.1808, acc: 93.2810, loss_bbox: 0.2308, loss_mask: 0.2274, loss: 0.7009 2024-05-29 20:47:20,726 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 6:43:14, time: 0.744, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0465, loss_cls: 0.1789, acc: 93.3838, loss_bbox: 0.2287, loss_mask: 0.2315, loss: 0.7043 2024-05-29 20:47:53,865 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 6:42:38, time: 0.663, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0415, loss_cls: 0.1721, acc: 93.6372, loss_bbox: 0.2201, loss_mask: 0.2281, loss: 0.6782 2024-05-29 20:48:31,729 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 6:42:06, time: 0.757, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0418, loss_cls: 0.1709, acc: 93.7449, loss_bbox: 0.2226, loss_mask: 0.2246, loss: 0.6758 2024-05-29 20:49:05,301 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 6:41:31, time: 0.671, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0415, loss_cls: 0.1732, acc: 
93.6296, loss_bbox: 0.2168, loss_mask: 0.2264, loss: 0.6730 2024-05-29 20:49:39,135 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 6:40:56, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0422, loss_cls: 0.1762, acc: 93.5942, loss_bbox: 0.2225, loss_mask: 0.2259, loss: 0.6834 2024-05-29 20:50:17,602 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 6:40:24, time: 0.769, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0450, loss_cls: 0.1814, acc: 93.2844, loss_bbox: 0.2294, loss_mask: 0.2322, loss: 0.7059 2024-05-29 20:50:53,720 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 6:39:50, time: 0.722, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0437, loss_cls: 0.1761, acc: 93.4956, loss_bbox: 0.2234, loss_mask: 0.2308, loss: 0.6921 2024-05-29 20:51:27,895 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 6:39:16, time: 0.683, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0449, loss_cls: 0.1808, acc: 93.3643, loss_bbox: 0.2343, loss_mask: 0.2333, loss: 0.7120 2024-05-29 20:52:01,419 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 6:38:41, time: 0.670, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0426, loss_cls: 0.1757, acc: 93.4968, loss_bbox: 0.2252, loss_mask: 0.2284, loss: 0.6893 2024-05-29 20:52:34,871 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 6:38:05, time: 0.669, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0442, loss_cls: 0.1816, acc: 93.2673, loss_bbox: 0.2261, loss_mask: 0.2301, loss: 0.6988 2024-05-29 20:53:12,786 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 6:37:33, time: 0.759, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0420, loss_cls: 0.1749, acc: 93.4885, loss_bbox: 0.2230, loss_mask: 0.2256, loss: 0.6818 2024-05-29 20:53:46,491 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 6:36:58, time: 0.674, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0450, loss_cls: 0.1843, acc: 93.2668, loss_bbox: 0.2309, loss_mask: 0.2325, loss: 0.7108 2024-05-29 20:54:19,731 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 6:36:23, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0437, loss_cls: 0.1823, acc: 93.3728, loss_bbox: 0.2335, loss_mask: 0.2337, loss: 0.7089 2024-05-29 20:54:52,841 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 6:35:47, time: 0.662, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0394, loss_cls: 0.1754, acc: 93.5315, loss_bbox: 0.2242, loss_mask: 0.2252, loss: 0.6835 2024-05-29 20:55:25,764 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 6:35:12, time: 0.659, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0432, loss_cls: 0.1746, acc: 93.6365, loss_bbox: 0.2224, loss_mask: 0.2315, loss: 0.6886 2024-05-29 20:55:58,860 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 6:34:36, time: 0.662, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0436, loss_cls: 0.1773, acc: 93.4021, loss_bbox: 0.2307, loss_mask: 0.2278, loss: 0.6949 2024-05-29 20:56:32,085 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 6:34:01, time: 0.665, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0445, loss_cls: 0.1788, acc: 93.4033, loss_bbox: 0.2182, loss_mask: 0.2302, loss: 0.6885 2024-05-29 20:57:07,166 - mmdet - INFO - Epoch 
[8][2450/7330] lr: 1.000e-04, eta: 6:33:27, time: 0.702, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0455, loss_cls: 0.1792, acc: 93.3794, loss_bbox: 0.2249, loss_mask: 0.2295, loss: 0.6962 2024-05-29 20:57:43,749 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 6:32:54, time: 0.732, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0410, loss_cls: 0.1753, acc: 93.5491, loss_bbox: 0.2237, loss_mask: 0.2269, loss: 0.6832 2024-05-29 20:58:16,885 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 6:32:18, time: 0.663, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0411, loss_cls: 0.1703, acc: 93.7126, loss_bbox: 0.2175, loss_mask: 0.2306, loss: 0.6779 2024-05-29 20:58:54,136 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 6:31:45, time: 0.745, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0463, loss_cls: 0.1783, acc: 93.4055, loss_bbox: 0.2283, loss_mask: 0.2297, loss: 0.6999 2024-05-29 20:59:27,311 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 6:31:10, time: 0.663, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0412, loss_cls: 0.1697, acc: 93.7144, loss_bbox: 0.2186, loss_mask: 0.2263, loss: 0.6723 2024-05-29 21:00:04,899 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 6:30:37, time: 0.752, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0410, loss_cls: 0.1701, acc: 93.7744, loss_bbox: 0.2144, loss_mask: 0.2256, loss: 0.6682 2024-05-29 21:00:40,953 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 6:30:04, time: 0.721, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0453, loss_cls: 0.1784, acc: 93.3613, loss_bbox: 0.2322, loss_mask: 0.2310, loss: 0.7041 2024-05-29 21:01:14,541 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 6:29:29, time: 0.672, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0431, loss_cls: 0.1807, acc: 93.4055, loss_bbox: 0.2235, loss_mask: 0.2253, loss: 0.6904 2024-05-29 21:01:48,295 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 6:28:54, time: 0.675, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0424, loss_cls: 0.1706, acc: 93.7942, loss_bbox: 0.2168, loss_mask: 0.2296, loss: 0.6770 2024-05-29 21:02:21,517 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 6:28:18, time: 0.664, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0424, loss_cls: 0.1732, acc: 93.6562, loss_bbox: 0.2224, loss_mask: 0.2261, loss: 0.6818 2024-05-29 21:02:54,707 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 6:27:43, time: 0.664, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0425, loss_cls: 0.1799, acc: 93.4697, loss_bbox: 0.2276, loss_mask: 0.2254, loss: 0.6921 2024-05-29 21:03:32,744 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 6:27:11, time: 0.761, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0438, loss_cls: 0.1816, acc: 93.3254, loss_bbox: 0.2278, loss_mask: 0.2339, loss: 0.7048 2024-05-29 21:04:06,544 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 6:26:36, time: 0.676, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0434, loss_cls: 0.1780, acc: 93.4639, loss_bbox: 0.2265, loss_mask: 0.2356, loss: 0.7021 2024-05-29 21:04:39,606 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 6:26:00, time: 0.661, data_time: 0.049, memory: 11628, loss_rpn_cls: 
0.0164, loss_rpn_bbox: 0.0421, loss_cls: 0.1739, acc: 93.5654, loss_bbox: 0.2234, loss_mask: 0.2307, loss: 0.6865 2024-05-29 21:05:12,869 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 6:25:25, time: 0.665, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0445, loss_cls: 0.1779, acc: 93.4382, loss_bbox: 0.2277, loss_mask: 0.2291, loss: 0.6966 2024-05-29 21:05:46,869 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 6:24:50, time: 0.680, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0443, loss_cls: 0.1782, acc: 93.4546, loss_bbox: 0.2345, loss_mask: 0.2278, loss: 0.7015 2024-05-29 21:06:20,259 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 6:24:15, time: 0.668, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0432, loss_cls: 0.1794, acc: 93.3325, loss_bbox: 0.2270, loss_mask: 0.2335, loss: 0.7007 2024-05-29 21:06:53,816 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 6:23:40, time: 0.671, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0452, loss_cls: 0.1804, acc: 93.3774, loss_bbox: 0.2285, loss_mask: 0.2308, loss: 0.7031 2024-05-29 21:07:29,859 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 6:23:06, time: 0.721, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0433, loss_cls: 0.1761, acc: 93.5222, loss_bbox: 0.2233, loss_mask: 0.2265, loss: 0.6870 2024-05-29 21:08:06,228 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 6:22:33, time: 0.727, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0433, loss_cls: 0.1748, acc: 93.6057, loss_bbox: 0.2240, loss_mask: 0.2359, loss: 0.6937 2024-05-29 21:08:42,118 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 6:21:59, time: 0.718, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0432, loss_cls: 0.1778, acc: 93.5100, loss_bbox: 0.2234, loss_mask: 0.2330, loss: 0.6945 2024-05-29 21:09:16,287 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 6:21:25, time: 0.683, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0443, loss_cls: 0.1801, acc: 93.4329, loss_bbox: 0.2260, loss_mask: 0.2352, loss: 0.7020 2024-05-29 21:09:49,975 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 6:20:50, time: 0.674, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0417, loss_cls: 0.1819, acc: 93.1655, loss_bbox: 0.2317, loss_mask: 0.2291, loss: 0.7006 2024-05-29 21:10:27,299 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 6:20:17, time: 0.747, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0463, loss_cls: 0.1837, acc: 93.2681, loss_bbox: 0.2300, loss_mask: 0.2340, loss: 0.7108 2024-05-29 21:11:03,036 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 6:19:43, time: 0.715, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0416, loss_cls: 0.1821, acc: 93.2869, loss_bbox: 0.2241, loss_mask: 0.2307, loss: 0.6951 2024-05-29 21:11:36,016 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 6:19:08, time: 0.660, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0425, loss_cls: 0.1796, acc: 93.3997, loss_bbox: 0.2241, loss_mask: 0.2297, loss: 0.6920 2024-05-29 21:12:09,260 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 6:18:32, time: 0.665, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0434, loss_cls: 0.1792, acc: 93.3940, loss_bbox: 0.2262, loss_mask: 0.2275, loss: 
0.6940 2024-05-29 21:12:42,245 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 6:17:57, time: 0.660, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0445, loss_cls: 0.1820, acc: 93.2451, loss_bbox: 0.2307, loss_mask: 0.2346, loss: 0.7092 2024-05-29 21:13:15,158 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 6:17:21, time: 0.658, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0437, loss_cls: 0.1713, acc: 93.6851, loss_bbox: 0.2160, loss_mask: 0.2294, loss: 0.6766 2024-05-29 21:13:53,929 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 6:16:49, time: 0.775, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0425, loss_cls: 0.1774, acc: 93.3914, loss_bbox: 0.2295, loss_mask: 0.2354, loss: 0.7013 2024-05-29 21:14:27,204 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 6:16:14, time: 0.666, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0407, loss_cls: 0.1759, acc: 93.5371, loss_bbox: 0.2244, loss_mask: 0.2289, loss: 0.6855 2024-05-29 21:15:00,818 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 6:15:39, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0447, loss_cls: 0.1817, acc: 93.3059, loss_bbox: 0.2276, loss_mask: 0.2294, loss: 0.7013 2024-05-29 21:15:34,062 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 6:15:04, time: 0.665, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0409, loss_cls: 0.1744, acc: 93.5632, loss_bbox: 0.2209, loss_mask: 0.2317, loss: 0.6850 2024-05-29 21:16:07,553 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 6:14:29, time: 0.670, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0454, loss_cls: 0.1836, acc: 93.2927, loss_bbox: 0.2347, loss_mask: 0.2338, loss: 0.7162 2024-05-29 21:16:40,941 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 6:13:53, time: 0.668, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1787, acc: 93.5649, loss_bbox: 0.2221, loss_mask: 0.2312, loss: 0.6936 2024-05-29 21:17:14,858 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 6:13:19, time: 0.678, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0437, loss_cls: 0.1799, acc: 93.4651, loss_bbox: 0.2237, loss_mask: 0.2254, loss: 0.6904 2024-05-29 21:17:52,974 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 6:12:46, time: 0.762, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0424, loss_cls: 0.1683, acc: 93.7605, loss_bbox: 0.2184, loss_mask: 0.2263, loss: 0.6733 2024-05-29 21:18:26,534 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 6:12:11, time: 0.671, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0432, loss_cls: 0.1710, acc: 93.7190, loss_bbox: 0.2202, loss_mask: 0.2303, loss: 0.6812 2024-05-29 21:19:02,181 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 6:11:37, time: 0.713, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0409, loss_cls: 0.1706, acc: 93.7881, loss_bbox: 0.2161, loss_mask: 0.2233, loss: 0.6666 2024-05-29 21:19:35,809 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 6:11:02, time: 0.673, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0401, loss_cls: 0.1742, acc: 93.6782, loss_bbox: 0.2210, loss_mask: 0.2290, loss: 0.6808 2024-05-29 21:20:09,147 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 6:10:27, time: 
0.667, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0430, loss_cls: 0.1771, acc: 93.4399, loss_bbox: 0.2242, loss_mask: 0.2286, loss: 0.6890 2024-05-29 21:20:47,252 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 6:09:55, time: 0.762, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0450, loss_cls: 0.1799, acc: 93.3533, loss_bbox: 0.2265, loss_mask: 0.2282, loss: 0.6987 2024-05-29 21:21:22,988 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 6:09:21, time: 0.715, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0438, loss_cls: 0.1818, acc: 93.3286, loss_bbox: 0.2252, loss_mask: 0.2311, loss: 0.6985 2024-05-29 21:21:56,687 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 6:08:46, time: 0.674, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0426, loss_cls: 0.1752, acc: 93.4717, loss_bbox: 0.2218, loss_mask: 0.2320, loss: 0.6894 2024-05-29 21:22:30,307 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 6:08:11, time: 0.672, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0420, loss_cls: 0.1744, acc: 93.6025, loss_bbox: 0.2218, loss_mask: 0.2261, loss: 0.6802 2024-05-29 21:23:03,634 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 6:07:36, time: 0.667, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0402, loss_cls: 0.1731, acc: 93.6187, loss_bbox: 0.2166, loss_mask: 0.2315, loss: 0.6771 2024-05-29 21:23:39,399 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 6:07:02, time: 0.715, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1701, acc: 93.7903, loss_bbox: 0.2181, loss_mask: 0.2283, loss: 0.6767 2024-05-29 21:24:14,914 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 6:06:28, time: 0.710, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0433, loss_cls: 0.1773, acc: 93.4949, loss_bbox: 0.2277, loss_mask: 0.2354, loss: 0.7012 2024-05-29 21:24:48,374 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 6:05:53, time: 0.669, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0465, loss_cls: 0.1853, acc: 93.1575, loss_bbox: 0.2350, loss_mask: 0.2367, loss: 0.7218 2024-05-29 21:25:22,012 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 6:05:18, time: 0.673, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0451, loss_cls: 0.1809, acc: 93.3213, loss_bbox: 0.2270, loss_mask: 0.2331, loss: 0.7024 2024-05-29 21:25:55,694 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 6:04:43, time: 0.674, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0429, loss_cls: 0.1742, acc: 93.6177, loss_bbox: 0.2214, loss_mask: 0.2300, loss: 0.6853 2024-05-29 21:26:29,549 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 6:04:08, time: 0.677, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0447, loss_cls: 0.1806, acc: 93.3657, loss_bbox: 0.2280, loss_mask: 0.2282, loss: 0.6984 2024-05-29 21:27:02,953 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 6:03:33, time: 0.668, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0442, loss_cls: 0.1800, acc: 93.2769, loss_bbox: 0.2270, loss_mask: 0.2268, loss: 0.6947 2024-05-29 21:27:38,782 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 6:02:59, time: 0.717, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0429, loss_cls: 0.1820, acc: 
93.3818, loss_bbox: 0.2275, loss_mask: 0.2329, loss: 0.7018 2024-05-29 21:28:14,918 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 6:02:25, time: 0.723, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0468, loss_cls: 0.1837, acc: 93.2124, loss_bbox: 0.2310, loss_mask: 0.2339, loss: 0.7137 2024-05-29 21:28:48,732 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 6:01:50, time: 0.676, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0436, loss_cls: 0.1781, acc: 93.4805, loss_bbox: 0.2260, loss_mask: 0.2327, loss: 0.6991 2024-05-29 21:29:25,014 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 6:01:17, time: 0.726, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0448, loss_cls: 0.1736, acc: 93.6367, loss_bbox: 0.2245, loss_mask: 0.2336, loss: 0.6931 2024-05-29 21:29:58,690 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 6:00:42, time: 0.673, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0440, loss_cls: 0.1817, acc: 93.4050, loss_bbox: 0.2272, loss_mask: 0.2311, loss: 0.7019 2024-05-29 21:30:31,983 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 6:00:07, time: 0.666, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0422, loss_cls: 0.1803, acc: 93.2988, loss_bbox: 0.2266, loss_mask: 0.2294, loss: 0.6958 2024-05-29 21:31:09,482 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 5:59:34, time: 0.750, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0453, loss_cls: 0.1776, acc: 93.4004, loss_bbox: 0.2271, loss_mask: 0.2379, loss: 0.7056 2024-05-29 21:31:44,769 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 5:59:00, time: 0.706, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0428, loss_cls: 0.1731, acc: 93.6414, loss_bbox: 0.2209, loss_mask: 0.2300, loss: 0.6848 2024-05-29 21:32:18,049 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 5:58:24, time: 0.666, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0427, loss_cls: 0.1730, acc: 93.6294, loss_bbox: 0.2179, loss_mask: 0.2256, loss: 0.6765 2024-05-29 21:32:51,448 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 5:57:49, time: 0.668, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0424, loss_cls: 0.1749, acc: 93.6428, loss_bbox: 0.2258, loss_mask: 0.2331, loss: 0.6936 2024-05-29 21:33:25,079 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 5:57:14, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0447, loss_cls: 0.1813, acc: 93.3420, loss_bbox: 0.2296, loss_mask: 0.2282, loss: 0.7014 2024-05-29 21:34:00,793 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 5:56:40, time: 0.714, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0451, loss_cls: 0.1882, acc: 93.1372, loss_bbox: 0.2371, loss_mask: 0.2327, loss: 0.7209 2024-05-29 21:34:36,015 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 5:56:06, time: 0.704, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0412, loss_cls: 0.1786, acc: 93.4341, loss_bbox: 0.2228, loss_mask: 0.2300, loss: 0.6893 2024-05-29 21:35:09,758 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 5:55:31, time: 0.675, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0456, loss_cls: 0.1784, acc: 93.4495, loss_bbox: 0.2236, loss_mask: 0.2261, loss: 0.6911 2024-05-29 21:35:43,164 - mmdet - INFO - Epoch 
[8][5800/7330] lr: 1.000e-04, eta: 5:54:56, time: 0.668, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0419, loss_cls: 0.1673, acc: 93.9014, loss_bbox: 0.2115, loss_mask: 0.2246, loss: 0.6609 2024-05-29 21:36:16,418 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 5:54:21, time: 0.665, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0452, loss_cls: 0.1761, acc: 93.5154, loss_bbox: 0.2250, loss_mask: 0.2345, loss: 0.6986 2024-05-29 21:36:49,709 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 5:53:46, time: 0.666, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0434, loss_cls: 0.1725, acc: 93.6860, loss_bbox: 0.2184, loss_mask: 0.2324, loss: 0.6837 2024-05-29 21:37:23,014 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 5:53:11, time: 0.666, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0390, loss_cls: 0.1683, acc: 93.7834, loss_bbox: 0.2130, loss_mask: 0.2249, loss: 0.6602 2024-05-29 21:37:58,759 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 5:52:37, time: 0.715, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0435, loss_cls: 0.1738, acc: 93.5190, loss_bbox: 0.2202, loss_mask: 0.2240, loss: 0.6781 2024-05-29 21:38:34,907 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 5:52:03, time: 0.723, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0428, loss_cls: 0.1807, acc: 93.3889, loss_bbox: 0.2262, loss_mask: 0.2312, loss: 0.6983 2024-05-29 21:39:08,401 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 5:51:28, time: 0.670, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0439, loss_cls: 0.1764, acc: 93.5830, loss_bbox: 0.2217, loss_mask: 0.2279, loss: 0.6883 2024-05-29 21:39:43,944 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 5:50:54, time: 0.711, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0402, loss_cls: 0.1673, acc: 93.8604, loss_bbox: 0.2102, loss_mask: 0.2266, loss: 0.6613 2024-05-29 21:40:17,145 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 5:50:19, time: 0.664, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0406, loss_cls: 0.1759, acc: 93.5100, loss_bbox: 0.2196, loss_mask: 0.2291, loss: 0.6825 2024-05-29 21:40:55,071 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 5:49:46, time: 0.758, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0456, loss_cls: 0.1868, acc: 93.1472, loss_bbox: 0.2326, loss_mask: 0.2359, loss: 0.7194 2024-05-29 21:41:31,037 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 5:49:13, time: 0.720, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0438, loss_cls: 0.1833, acc: 93.3987, loss_bbox: 0.2243, loss_mask: 0.2307, loss: 0.6993 2024-05-29 21:42:04,854 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 5:48:38, time: 0.676, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0434, loss_cls: 0.1873, acc: 92.9695, loss_bbox: 0.2388, loss_mask: 0.2327, loss: 0.7197 2024-05-29 21:42:38,443 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 5:48:03, time: 0.672, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0407, loss_cls: 0.1719, acc: 93.7688, loss_bbox: 0.2145, loss_mask: 0.2254, loss: 0.6698 2024-05-29 21:43:11,778 - mmdet - INFO - Epoch [8][6450/7330] lr: 1.000e-04, eta: 5:47:27, time: 0.666, data_time: 0.055, memory: 11628, loss_rpn_cls: 
0.0187, loss_rpn_bbox: 0.0441, loss_cls: 0.1686, acc: 93.7358, loss_bbox: 0.2158, loss_mask: 0.2260, loss: 0.6732 2024-05-29 21:43:45,151 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 5:46:52, time: 0.668, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1855, acc: 93.0664, loss_bbox: 0.2318, loss_mask: 0.2303, loss: 0.7092 2024-05-29 21:44:24,326 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 5:46:20, time: 0.783, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0414, loss_cls: 0.1799, acc: 93.4543, loss_bbox: 0.2232, loss_mask: 0.2251, loss: 0.6853 2024-05-29 21:44:58,216 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 5:45:45, time: 0.678, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0443, loss_cls: 0.1765, acc: 93.5000, loss_bbox: 0.2255, loss_mask: 0.2291, loss: 0.6921 2024-05-29 21:45:31,484 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 5:45:10, time: 0.665, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0413, loss_cls: 0.1765, acc: 93.4937, loss_bbox: 0.2254, loss_mask: 0.2350, loss: 0.6942 2024-05-29 21:46:05,189 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 5:44:35, time: 0.674, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0412, loss_cls: 0.1739, acc: 93.6335, loss_bbox: 0.2182, loss_mask: 0.2290, loss: 0.6788 2024-05-29 21:46:38,411 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 5:44:00, time: 0.664, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0419, loss_cls: 0.1826, acc: 93.2993, loss_bbox: 0.2302, loss_mask: 0.2307, loss: 0.7016 2024-05-29 21:47:12,103 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 5:43:25, time: 0.674, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0430, loss_cls: 0.1805, acc: 93.4258, loss_bbox: 0.2283, loss_mask: 0.2340, loss: 0.7032 2024-05-29 21:47:45,244 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 5:42:50, time: 0.663, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0436, loss_cls: 0.1728, acc: 93.6604, loss_bbox: 0.2205, loss_mask: 0.2280, loss: 0.6813 2024-05-29 21:48:20,769 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 5:42:16, time: 0.710, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0420, loss_cls: 0.1651, acc: 94.0220, loss_bbox: 0.2133, loss_mask: 0.2286, loss: 0.6651 2024-05-29 21:48:56,967 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 5:41:42, time: 0.724, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0428, loss_cls: 0.1684, acc: 93.8259, loss_bbox: 0.2143, loss_mask: 0.2241, loss: 0.6660 2024-05-29 21:49:32,870 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 5:41:09, time: 0.718, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0428, loss_cls: 0.1756, acc: 93.5256, loss_bbox: 0.2229, loss_mask: 0.2242, loss: 0.6813 2024-05-29 21:50:06,791 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 5:40:34, time: 0.678, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0427, loss_cls: 0.1818, acc: 93.3582, loss_bbox: 0.2269, loss_mask: 0.2269, loss: 0.6969 2024-05-29 21:50:40,417 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 5:39:59, time: 0.673, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0452, loss_cls: 0.1811, acc: 93.3486, loss_bbox: 0.2299, loss_mask: 0.2355, loss: 
0.7092
2024-05-29 21:51:18,422 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 5:39:26, time: 0.760, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0407, loss_cls: 0.1827, acc: 93.3970, loss_bbox: 0.2248, loss_mask: 0.2297, loss: 0.6943
2024-05-29 21:51:55,078 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 5:38:53, time: 0.733, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0470, loss_cls: 0.1839, acc: 93.1782, loss_bbox: 0.2359, loss_mask: 0.2396, loss: 0.7250
2024-05-29 21:52:28,599 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 5:38:18, time: 0.671, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0412, loss_cls: 0.1819, acc: 93.4087, loss_bbox: 0.2242, loss_mask: 0.2308, loss: 0.6965
2024-05-29 21:53:02,229 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 5:37:43, time: 0.672, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0446, loss_cls: 0.1801, acc: 93.3572, loss_bbox: 0.2298, loss_mask: 0.2253, loss: 0.6966
2024-05-29 21:53:23,115 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-05-29 21:55:16,731 - mmdet - INFO - Evaluating bbox...
2024-05-29 21:55:38,193 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.432
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.658
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.472
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.268
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.471
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.586
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.555
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.555
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.555
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.372
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.602
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.726
2024-05-29 21:55:38,194 - mmdet - INFO - Evaluating segm...
2024-05-29 21:56:05,474 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.391
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.623
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.418
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.188
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.427
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.605
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.506
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.506
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.506
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.554
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.696
2024-05-29 21:56:05,806 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 21:56:05,808 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4320, bbox_mAP_50: 0.6580, bbox_mAP_75: 0.4720, bbox_mAP_s: 0.2680, bbox_mAP_m: 0.4710, bbox_mAP_l: 0.5860, bbox_mAP_copypaste: 0.432 0.658 0.472 0.268 0.471 0.586, segm_mAP: 0.3910, segm_mAP_50: 0.6230, segm_mAP_75: 0.4180, segm_mAP_s: 0.1880, segm_mAP_m: 0.4270, segm_mAP_l: 0.6050, segm_mAP_copypaste: 0.391 0.623 0.418 0.188 0.427 0.605
2024-05-29 21:56:46,315 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 5:36:40, time: 0.810, data_time: 0.118, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0411, loss_cls: 0.1730, acc: 93.5474, loss_bbox: 0.2216, loss_mask: 0.2264, loss: 0.6766
2024-05-29 21:57:20,087 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 5:36:05, time: 0.675, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0409, loss_cls: 0.1673, acc: 93.8213, loss_bbox: 0.2129, loss_mask: 0.2234, loss: 0.6599
2024-05-29 21:57:54,292 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 5:35:30, time: 0.684, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0411, loss_cls: 0.1663, acc: 93.7700, loss_bbox: 0.2145, loss_mask: 0.2257, loss: 0.6627
2024-05-29 21:58:27,735 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 5:34:55, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0395, loss_cls: 0.1636, acc: 94.0039, loss_bbox: 0.2121, loss_mask: 0.2275, loss: 0.6580
2024-05-29 21:59:01,898 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 5:34:21, time: 0.683, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0427, loss_cls: 0.1693, acc: 93.6152, loss_bbox: 0.2159, loss_mask: 0.2191, loss: 0.6628
2024-05-29 21:59:35,319 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 5:33:46, time: 0.668, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0435, loss_cls: 0.1689, acc: 93.5828, loss_bbox: 0.2190, loss_mask: 0.2276, loss: 0.6751
2024-05-29 22:00:09,666 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 5:33:11, time: 0.687, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0428, loss_cls: 0.1703, acc: 93.5730, loss_bbox: 0.2216, loss_mask: 0.2284, loss: 0.6786
2024-05-29 22:00:43,370 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 5:32:36, time: 0.674, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0402, loss_cls: 0.1620, acc: 93.9146, loss_bbox: 0.2096, loss_mask: 
0.2191, loss: 0.6455 2024-05-29 22:01:16,604 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 5:32:01, time: 0.665, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0401, loss_cls: 0.1655, acc: 93.7651, loss_bbox: 0.2163, loss_mask: 0.2245, loss: 0.6610 2024-05-29 22:01:49,737 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 5:31:26, time: 0.663, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0394, loss_cls: 0.1604, acc: 93.9346, loss_bbox: 0.2126, loss_mask: 0.2190, loss: 0.6448 2024-05-29 22:02:24,244 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 5:30:51, time: 0.690, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0410, loss_cls: 0.1622, acc: 93.9055, loss_bbox: 0.2151, loss_mask: 0.2225, loss: 0.6559 2024-05-29 22:02:58,624 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 5:30:17, time: 0.688, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0408, loss_cls: 0.1555, acc: 94.1526, loss_bbox: 0.2046, loss_mask: 0.2185, loss: 0.6341 2024-05-29 22:03:32,781 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 5:29:42, time: 0.683, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0419, loss_cls: 0.1582, acc: 93.9937, loss_bbox: 0.2095, loss_mask: 0.2191, loss: 0.6430 2024-05-29 22:04:06,480 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 5:29:07, time: 0.674, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0412, loss_cls: 0.1610, acc: 94.0056, loss_bbox: 0.2107, loss_mask: 0.2191, loss: 0.6462 2024-05-29 22:04:39,917 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 5:28:32, time: 0.669, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0386, loss_cls: 0.1562, acc: 94.1831, loss_bbox: 0.2063, loss_mask: 0.2162, loss: 0.6308 2024-05-29 22:05:17,504 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 5:27:59, time: 0.752, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0395, loss_cls: 0.1563, acc: 94.1292, loss_bbox: 0.2058, loss_mask: 0.2182, loss: 0.6338 2024-05-29 22:05:53,508 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 5:27:25, time: 0.720, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0403, loss_cls: 0.1560, acc: 93.9883, loss_bbox: 0.2065, loss_mask: 0.2208, loss: 0.6370 2024-05-29 22:06:31,805 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 5:26:53, time: 0.766, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0388, loss_cls: 0.1571, acc: 94.0808, loss_bbox: 0.2046, loss_mask: 0.2117, loss: 0.6257 2024-05-29 22:07:05,432 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 5:26:18, time: 0.673, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0370, loss_cls: 0.1562, acc: 94.1484, loss_bbox: 0.2085, loss_mask: 0.2204, loss: 0.6349 2024-05-29 22:07:39,522 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 5:25:43, time: 0.682, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0404, loss_cls: 0.1568, acc: 94.1382, loss_bbox: 0.2080, loss_mask: 0.2173, loss: 0.6371 2024-05-29 22:08:15,115 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 5:25:09, time: 0.712, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0396, loss_cls: 0.1636, acc: 93.8125, loss_bbox: 0.2144, loss_mask: 0.2241, loss: 0.6572 2024-05-29 22:08:48,506 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 5:24:34, time: 
0.668, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0403, loss_cls: 0.1546, acc: 94.1853, loss_bbox: 0.2018, loss_mask: 0.2147, loss: 0.6254 2024-05-29 22:09:28,830 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 5:24:02, time: 0.806, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0402, loss_cls: 0.1544, acc: 94.1072, loss_bbox: 0.2076, loss_mask: 0.2211, loss: 0.6370 2024-05-29 22:10:02,251 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 5:23:27, time: 0.668, data_time: 0.036, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0397, loss_cls: 0.1569, acc: 94.0269, loss_bbox: 0.2101, loss_mask: 0.2233, loss: 0.6441 2024-05-29 22:10:35,434 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 5:22:52, time: 0.664, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0390, loss_cls: 0.1560, acc: 94.1978, loss_bbox: 0.2049, loss_mask: 0.2221, loss: 0.6362 2024-05-29 22:11:09,589 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 5:22:18, time: 0.683, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0402, loss_cls: 0.1593, acc: 93.9319, loss_bbox: 0.2096, loss_mask: 0.2180, loss: 0.6416 2024-05-29 22:11:43,617 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 5:21:43, time: 0.681, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0421, loss_cls: 0.1720, acc: 93.5286, loss_bbox: 0.2212, loss_mask: 0.2252, loss: 0.6763 2024-05-29 22:12:17,492 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 5:21:08, time: 0.677, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0422, loss_cls: 0.1601, acc: 94.0042, loss_bbox: 0.2110, loss_mask: 0.2170, loss: 0.6449 2024-05-29 22:12:51,008 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 5:20:33, time: 0.670, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0396, loss_cls: 0.1530, acc: 94.2527, loss_bbox: 0.2044, loss_mask: 0.2211, loss: 0.6322 2024-05-29 22:13:24,072 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 5:19:58, time: 0.661, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0403, loss_cls: 0.1621, acc: 93.8228, loss_bbox: 0.2174, loss_mask: 0.2246, loss: 0.6574 2024-05-29 22:13:57,440 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 5:19:23, time: 0.667, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0402, loss_cls: 0.1585, acc: 93.9500, loss_bbox: 0.2093, loss_mask: 0.2184, loss: 0.6403 2024-05-29 22:14:31,248 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 5:18:48, time: 0.676, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0416, loss_cls: 0.1614, acc: 93.8821, loss_bbox: 0.2145, loss_mask: 0.2188, loss: 0.6504 2024-05-29 22:15:08,354 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 5:18:15, time: 0.742, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0413, loss_cls: 0.1616, acc: 93.9421, loss_bbox: 0.2103, loss_mask: 0.2218, loss: 0.6492 2024-05-29 22:15:42,230 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 5:17:40, time: 0.677, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0395, loss_cls: 0.1671, acc: 93.7717, loss_bbox: 0.2157, loss_mask: 0.2190, loss: 0.6563 2024-05-29 22:16:22,102 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 5:17:08, time: 0.797, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1523, acc: 
94.2837, loss_bbox: 0.2039, loss_mask: 0.2171, loss: 0.6224 2024-05-29 22:16:55,784 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 5:16:33, time: 0.674, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0398, loss_cls: 0.1576, acc: 94.1042, loss_bbox: 0.2073, loss_mask: 0.2188, loss: 0.6376 2024-05-29 22:17:29,445 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 5:15:58, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0396, loss_cls: 0.1618, acc: 93.9287, loss_bbox: 0.2125, loss_mask: 0.2174, loss: 0.6450 2024-05-29 22:18:02,310 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 5:15:23, time: 0.657, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0379, loss_cls: 0.1505, acc: 94.4263, loss_bbox: 0.2001, loss_mask: 0.2104, loss: 0.6107 2024-05-29 22:18:37,600 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 5:14:49, time: 0.706, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0391, loss_cls: 0.1600, acc: 94.0300, loss_bbox: 0.2141, loss_mask: 0.2193, loss: 0.6455 2024-05-29 22:19:12,774 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 5:14:14, time: 0.703, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0393, loss_cls: 0.1567, acc: 93.9712, loss_bbox: 0.2065, loss_mask: 0.2135, loss: 0.6303 2024-05-29 22:19:50,329 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 5:13:41, time: 0.751, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0415, loss_cls: 0.1565, acc: 94.1511, loss_bbox: 0.2076, loss_mask: 0.2170, loss: 0.6365 2024-05-29 22:20:24,216 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 5:13:07, time: 0.678, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0397, loss_cls: 0.1516, acc: 94.2900, loss_bbox: 0.2040, loss_mask: 0.2155, loss: 0.6251 2024-05-29 22:20:57,874 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 5:12:32, time: 0.673, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0435, loss_cls: 0.1652, acc: 93.8010, loss_bbox: 0.2170, loss_mask: 0.2229, loss: 0.6641 2024-05-29 22:21:31,601 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 5:11:57, time: 0.675, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0430, loss_cls: 0.1651, acc: 93.7485, loss_bbox: 0.2159, loss_mask: 0.2284, loss: 0.6679 2024-05-29 22:22:04,732 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 5:11:22, time: 0.663, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0373, loss_cls: 0.1501, acc: 94.2710, loss_bbox: 0.2057, loss_mask: 0.2190, loss: 0.6250 2024-05-29 22:22:38,248 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 5:10:47, time: 0.670, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0394, loss_cls: 0.1568, acc: 94.0964, loss_bbox: 0.2083, loss_mask: 0.2209, loss: 0.6395 2024-05-29 22:23:12,420 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 5:10:12, time: 0.683, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0404, loss_cls: 0.1677, acc: 93.5979, loss_bbox: 0.2200, loss_mask: 0.2256, loss: 0.6688 2024-05-29 22:23:45,736 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 5:09:37, time: 0.666, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0413, loss_cls: 0.1600, acc: 93.9246, loss_bbox: 0.2126, loss_mask: 0.2198, loss: 0.6500 2024-05-29 22:24:19,147 - mmdet - INFO - Epoch 
[9][2450/7330] lr: 1.000e-05, eta: 5:09:02, time: 0.668, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0388, loss_cls: 0.1620, acc: 93.8384, loss_bbox: 0.2155, loss_mask: 0.2231, loss: 0.6527 2024-05-29 22:24:52,321 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 5:08:27, time: 0.663, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0409, loss_cls: 0.1593, acc: 93.9944, loss_bbox: 0.2091, loss_mask: 0.2190, loss: 0.6435 2024-05-29 22:25:28,272 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 5:07:53, time: 0.719, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0391, loss_cls: 0.1532, acc: 94.2185, loss_bbox: 0.2040, loss_mask: 0.2148, loss: 0.6255 2024-05-29 22:26:03,899 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 5:07:19, time: 0.712, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0380, loss_cls: 0.1512, acc: 94.2480, loss_bbox: 0.2007, loss_mask: 0.2174, loss: 0.6213 2024-05-29 22:26:41,917 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 5:06:46, time: 0.760, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0383, loss_cls: 0.1560, acc: 94.0286, loss_bbox: 0.2071, loss_mask: 0.2157, loss: 0.6309 2024-05-29 22:27:14,994 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 5:06:11, time: 0.662, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0383, loss_cls: 0.1501, acc: 94.4106, loss_bbox: 0.2010, loss_mask: 0.2147, loss: 0.6176 2024-05-29 22:27:48,170 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 5:05:36, time: 0.664, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0375, loss_cls: 0.1486, acc: 94.3899, loss_bbox: 0.2005, loss_mask: 0.2150, loss: 0.6144 2024-05-29 22:28:21,066 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 5:05:01, time: 0.658, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0385, loss_cls: 0.1543, acc: 94.1755, loss_bbox: 0.2006, loss_mask: 0.2135, loss: 0.6204 2024-05-29 22:28:56,822 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 5:04:27, time: 0.715, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0393, loss_cls: 0.1557, acc: 94.1646, loss_bbox: 0.2044, loss_mask: 0.2164, loss: 0.6300 2024-05-29 22:29:36,893 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 5:03:55, time: 0.801, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0397, loss_cls: 0.1564, acc: 94.0874, loss_bbox: 0.2070, loss_mask: 0.2177, loss: 0.6345 2024-05-29 22:30:10,172 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 5:03:20, time: 0.665, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0399, loss_cls: 0.1530, acc: 94.2903, loss_bbox: 0.2015, loss_mask: 0.2192, loss: 0.6275 2024-05-29 22:30:43,076 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 5:02:44, time: 0.658, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0381, loss_cls: 0.1554, acc: 94.1335, loss_bbox: 0.2072, loss_mask: 0.2171, loss: 0.6307 2024-05-29 22:31:17,066 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 5:02:10, time: 0.680, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0390, loss_cls: 0.1533, acc: 94.1951, loss_bbox: 0.2031, loss_mask: 0.2134, loss: 0.6219 2024-05-29 22:31:51,164 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 5:01:35, time: 0.682, data_time: 0.046, memory: 11628, loss_rpn_cls: 
0.0134, loss_rpn_bbox: 0.0394, loss_cls: 0.1529, acc: 94.1980, loss_bbox: 0.2083, loss_mask: 0.2207, loss: 0.6347 2024-05-29 22:32:25,095 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 5:01:00, time: 0.679, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0412, loss_cls: 0.1629, acc: 93.8425, loss_bbox: 0.2180, loss_mask: 0.2246, loss: 0.6611 2024-05-29 22:32:59,208 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 5:00:26, time: 0.682, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0401, loss_cls: 0.1486, acc: 94.4092, loss_bbox: 0.1989, loss_mask: 0.2163, loss: 0.6169 2024-05-29 22:33:33,012 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 4:59:51, time: 0.676, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0395, loss_cls: 0.1548, acc: 94.1626, loss_bbox: 0.2076, loss_mask: 0.2138, loss: 0.6296 2024-05-29 22:34:06,716 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 4:59:16, time: 0.674, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0384, loss_cls: 0.1584, acc: 93.9995, loss_bbox: 0.2067, loss_mask: 0.2215, loss: 0.6391 2024-05-29 22:34:40,013 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 4:58:41, time: 0.666, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0370, loss_cls: 0.1550, acc: 94.1409, loss_bbox: 0.2054, loss_mask: 0.2141, loss: 0.6242 2024-05-29 22:35:16,632 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 4:58:07, time: 0.732, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0406, loss_cls: 0.1567, acc: 94.0947, loss_bbox: 0.2038, loss_mask: 0.2173, loss: 0.6312 2024-05-29 22:35:50,419 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 4:57:32, time: 0.676, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0418, loss_cls: 0.1632, acc: 93.7852, loss_bbox: 0.2158, loss_mask: 0.2221, loss: 0.6571 2024-05-29 22:36:28,288 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 4:56:59, time: 0.757, data_time: 0.037, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0392, loss_cls: 0.1522, acc: 94.2114, loss_bbox: 0.2031, loss_mask: 0.2119, loss: 0.6185 2024-05-29 22:37:03,646 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 4:56:25, time: 0.707, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0385, loss_cls: 0.1511, acc: 94.2307, loss_bbox: 0.2058, loss_mask: 0.2138, loss: 0.6232 2024-05-29 22:37:37,040 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 4:55:50, time: 0.668, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0400, loss_cls: 0.1567, acc: 94.0378, loss_bbox: 0.2117, loss_mask: 0.2160, loss: 0.6379 2024-05-29 22:38:10,281 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 4:55:15, time: 0.665, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0370, loss_cls: 0.1527, acc: 94.2466, loss_bbox: 0.2030, loss_mask: 0.2110, loss: 0.6168 2024-05-29 22:38:46,455 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 4:54:41, time: 0.723, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0419, loss_cls: 0.1651, acc: 93.6831, loss_bbox: 0.2202, loss_mask: 0.2242, loss: 0.6671 2024-05-29 22:39:20,452 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 4:54:07, time: 0.680, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0394, loss_cls: 0.1577, acc: 93.9749, loss_bbox: 0.2112, loss_mask: 0.2178, loss: 
0.6395 2024-05-29 22:40:00,427 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 4:53:34, time: 0.799, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0378, loss_cls: 0.1517, acc: 94.3740, loss_bbox: 0.2034, loss_mask: 0.2191, loss: 0.6250 2024-05-29 22:40:33,503 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 4:52:59, time: 0.662, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0372, loss_cls: 0.1527, acc: 94.2725, loss_bbox: 0.2021, loss_mask: 0.2149, loss: 0.6201 2024-05-29 22:41:06,830 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 4:52:24, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0389, loss_cls: 0.1549, acc: 94.0588, loss_bbox: 0.2086, loss_mask: 0.2113, loss: 0.6270 2024-05-29 22:41:39,861 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 4:51:49, time: 0.661, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0362, loss_cls: 0.1497, acc: 94.3538, loss_bbox: 0.1990, loss_mask: 0.2134, loss: 0.6104 2024-05-29 22:42:12,918 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 4:51:14, time: 0.661, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0376, loss_cls: 0.1490, acc: 94.4375, loss_bbox: 0.1941, loss_mask: 0.2117, loss: 0.6051 2024-05-29 22:42:46,586 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 4:50:39, time: 0.673, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0397, loss_cls: 0.1552, acc: 94.2549, loss_bbox: 0.2073, loss_mask: 0.2180, loss: 0.6339 2024-05-29 22:43:19,845 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 4:50:04, time: 0.665, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0390, loss_cls: 0.1567, acc: 94.1443, loss_bbox: 0.2113, loss_mask: 0.2189, loss: 0.6400 2024-05-29 22:43:53,541 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 4:49:29, time: 0.674, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0414, loss_cls: 0.1654, acc: 93.6868, loss_bbox: 0.2238, loss_mask: 0.2242, loss: 0.6687 2024-05-29 22:44:26,910 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 4:48:54, time: 0.667, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0409, loss_cls: 0.1577, acc: 94.1045, loss_bbox: 0.2101, loss_mask: 0.2224, loss: 0.6458 2024-05-29 22:45:00,530 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 4:48:20, time: 0.672, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0377, loss_cls: 0.1538, acc: 94.1863, loss_bbox: 0.2041, loss_mask: 0.2143, loss: 0.6234 2024-05-29 22:45:36,028 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 4:47:45, time: 0.710, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0378, loss_cls: 0.1509, acc: 94.3059, loss_bbox: 0.2036, loss_mask: 0.2179, loss: 0.6242 2024-05-29 22:46:11,104 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 4:47:11, time: 0.702, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0370, loss_cls: 0.1516, acc: 94.2764, loss_bbox: 0.1982, loss_mask: 0.2112, loss: 0.6107 2024-05-29 22:46:49,467 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 4:46:38, time: 0.767, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0391, loss_cls: 0.1554, acc: 94.2161, loss_bbox: 0.2034, loss_mask: 0.2148, loss: 0.6266 2024-05-29 22:47:23,049 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 4:46:03, time: 
0.672, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0407, loss_cls: 0.1558, acc: 94.1357, loss_bbox: 0.2070, loss_mask: 0.2188, loss: 0.6356 2024-05-29 22:47:56,402 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 4:45:28, time: 0.667, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0393, loss_cls: 0.1595, acc: 93.8972, loss_bbox: 0.2113, loss_mask: 0.2174, loss: 0.6411 2024-05-29 22:48:29,123 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 4:44:53, time: 0.655, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0371, loss_cls: 0.1491, acc: 94.4163, loss_bbox: 0.1952, loss_mask: 0.2152, loss: 0.6087 2024-05-29 22:49:04,609 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 4:44:19, time: 0.710, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0402, loss_cls: 0.1557, acc: 94.1465, loss_bbox: 0.2068, loss_mask: 0.2158, loss: 0.6318 2024-05-29 22:49:40,182 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 4:43:45, time: 0.711, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0396, loss_cls: 0.1544, acc: 94.2554, loss_bbox: 0.2046, loss_mask: 0.2173, loss: 0.6303 2024-05-29 22:50:16,066 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 4:43:11, time: 0.718, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0381, loss_cls: 0.1542, acc: 94.3364, loss_bbox: 0.2023, loss_mask: 0.2132, loss: 0.6211 2024-05-29 22:50:49,594 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 4:42:36, time: 0.671, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0415, loss_cls: 0.1646, acc: 93.7449, loss_bbox: 0.2178, loss_mask: 0.2182, loss: 0.6556 2024-05-29 22:51:22,645 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 4:42:01, time: 0.661, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0382, loss_cls: 0.1524, acc: 94.2891, loss_bbox: 0.2032, loss_mask: 0.2170, loss: 0.6232 2024-05-29 22:51:56,195 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 4:41:26, time: 0.671, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0369, loss_cls: 0.1486, acc: 94.3489, loss_bbox: 0.2030, loss_mask: 0.2149, loss: 0.6168 2024-05-29 22:52:29,667 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 4:40:51, time: 0.669, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0383, loss_cls: 0.1550, acc: 94.2119, loss_bbox: 0.2070, loss_mask: 0.2158, loss: 0.6307 2024-05-29 22:53:02,912 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 4:40:16, time: 0.665, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0386, loss_cls: 0.1533, acc: 94.2368, loss_bbox: 0.2048, loss_mask: 0.2168, loss: 0.6276 2024-05-29 22:53:36,469 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 4:39:41, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0387, loss_cls: 0.1495, acc: 94.3413, loss_bbox: 0.2012, loss_mask: 0.2114, loss: 0.6137 2024-05-29 22:54:10,061 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 4:39:07, time: 0.672, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0411, loss_cls: 0.1536, acc: 94.1665, loss_bbox: 0.2081, loss_mask: 0.2153, loss: 0.6325 2024-05-29 22:54:43,468 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 4:38:32, time: 0.668, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0364, loss_cls: 0.1508, acc: 
94.3455, loss_bbox: 0.1979, loss_mask: 0.2162, loss: 0.6140 2024-05-29 22:55:16,341 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 4:37:57, time: 0.657, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0392, loss_cls: 0.1510, acc: 94.2751, loss_bbox: 0.2014, loss_mask: 0.2079, loss: 0.6125 2024-05-29 22:55:51,595 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 4:37:22, time: 0.705, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0415, loss_cls: 0.1596, acc: 93.9766, loss_bbox: 0.2124, loss_mask: 0.2177, loss: 0.6456 2024-05-29 22:56:27,501 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 4:36:48, time: 0.718, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0390, loss_cls: 0.1534, acc: 94.2476, loss_bbox: 0.2051, loss_mask: 0.2184, loss: 0.6293 2024-05-29 22:57:05,143 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 4:36:15, time: 0.753, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0414, loss_cls: 0.1563, acc: 94.0361, loss_bbox: 0.2090, loss_mask: 0.2194, loss: 0.6397 2024-05-29 22:57:38,970 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 4:35:40, time: 0.677, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0396, loss_cls: 0.1529, acc: 94.1343, loss_bbox: 0.2078, loss_mask: 0.2186, loss: 0.6318 2024-05-29 22:58:12,325 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 4:35:05, time: 0.667, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0385, loss_cls: 0.1463, acc: 94.4189, loss_bbox: 0.1959, loss_mask: 0.2126, loss: 0.6062 2024-05-29 22:58:46,049 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 4:34:31, time: 0.675, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0412, loss_cls: 0.1543, acc: 94.1260, loss_bbox: 0.2060, loss_mask: 0.2154, loss: 0.6314 2024-05-29 22:59:21,370 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 4:33:56, time: 0.706, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0376, loss_cls: 0.1513, acc: 94.3074, loss_bbox: 0.2013, loss_mask: 0.2138, loss: 0.6174 2024-05-29 23:00:01,733 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 4:33:24, time: 0.807, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0413, loss_cls: 0.1647, acc: 93.8232, loss_bbox: 0.2144, loss_mask: 0.2193, loss: 0.6543 2024-05-29 23:00:35,593 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 4:32:49, time: 0.677, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0392, loss_cls: 0.1522, acc: 94.2063, loss_bbox: 0.2049, loss_mask: 0.2185, loss: 0.6282 2024-05-29 23:01:08,654 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 4:32:14, time: 0.661, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0387, loss_cls: 0.1515, acc: 94.2178, loss_bbox: 0.2029, loss_mask: 0.2139, loss: 0.6200 2024-05-29 23:01:42,479 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 4:31:39, time: 0.677, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0393, loss_cls: 0.1564, acc: 94.1648, loss_bbox: 0.2035, loss_mask: 0.2139, loss: 0.6274 2024-05-29 23:02:15,575 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 4:31:04, time: 0.662, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0381, loss_cls: 0.1542, acc: 94.1177, loss_bbox: 0.2074, loss_mask: 0.2178, loss: 0.6304 2024-05-29 23:02:48,912 - mmdet - INFO - Epoch 
[9][5800/7330] lr: 1.000e-05, eta: 4:30:30, time: 0.667, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0385, loss_cls: 0.1479, acc: 94.4260, loss_bbox: 0.1977, loss_mask: 0.2183, loss: 0.6142 2024-05-29 23:03:22,700 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 4:29:55, time: 0.676, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0400, loss_cls: 0.1561, acc: 94.2168, loss_bbox: 0.2012, loss_mask: 0.2207, loss: 0.6319 2024-05-29 23:03:55,687 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 4:29:20, time: 0.660, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0391, loss_cls: 0.1478, acc: 94.3459, loss_bbox: 0.2013, loss_mask: 0.2190, loss: 0.6207 2024-05-29 23:04:29,623 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 4:28:45, time: 0.679, data_time: 0.069, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0409, loss_cls: 0.1528, acc: 94.2278, loss_bbox: 0.2096, loss_mask: 0.2166, loss: 0.6323 2024-05-29 23:05:03,343 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 4:28:10, time: 0.674, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0391, loss_cls: 0.1542, acc: 94.0737, loss_bbox: 0.2078, loss_mask: 0.2163, loss: 0.6311 2024-05-29 23:05:39,215 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 4:27:36, time: 0.717, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0390, loss_cls: 0.1576, acc: 94.0020, loss_bbox: 0.2069, loss_mask: 0.2216, loss: 0.6388 2024-05-29 23:06:12,812 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 4:27:01, time: 0.672, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0397, loss_cls: 0.1505, acc: 94.1587, loss_bbox: 0.2040, loss_mask: 0.2178, loss: 0.6253 2024-05-29 23:06:52,907 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 4:26:29, time: 0.802, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0375, loss_cls: 0.1504, acc: 94.3174, loss_bbox: 0.1987, loss_mask: 0.2102, loss: 0.6096 2024-05-29 23:07:26,008 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 4:25:54, time: 0.662, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0373, loss_cls: 0.1465, acc: 94.3860, loss_bbox: 0.1952, loss_mask: 0.2139, loss: 0.6061 2024-05-29 23:07:59,470 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 4:25:19, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0423, loss_cls: 0.1595, acc: 94.0496, loss_bbox: 0.2096, loss_mask: 0.2214, loss: 0.6469 2024-05-29 23:08:32,363 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 4:24:44, time: 0.658, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0376, loss_cls: 0.1468, acc: 94.5342, loss_bbox: 0.1985, loss_mask: 0.2167, loss: 0.6117 2024-05-29 23:09:08,056 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 4:24:10, time: 0.714, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0402, loss_cls: 0.1561, acc: 94.0723, loss_bbox: 0.2089, loss_mask: 0.2162, loss: 0.6359 2024-05-29 23:09:41,561 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 4:23:35, time: 0.670, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0376, loss_cls: 0.1549, acc: 94.0835, loss_bbox: 0.2059, loss_mask: 0.2176, loss: 0.6297 2024-05-29 23:10:21,775 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 4:23:03, time: 0.804, data_time: 0.051, memory: 11628, loss_rpn_cls: 
0.0127, loss_rpn_bbox: 0.0378, loss_cls: 0.1498, acc: 94.4302, loss_bbox: 0.2008, loss_mask: 0.2190, loss: 0.6202 2024-05-29 23:10:55,312 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 4:22:28, time: 0.671, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0394, loss_cls: 0.1575, acc: 94.0281, loss_bbox: 0.2138, loss_mask: 0.2192, loss: 0.6439 2024-05-29 23:11:28,637 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 4:21:53, time: 0.666, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0388, loss_cls: 0.1522, acc: 94.2410, loss_bbox: 0.2039, loss_mask: 0.2140, loss: 0.6214 2024-05-29 23:12:01,823 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 4:21:18, time: 0.664, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0400, loss_cls: 0.1565, acc: 94.1086, loss_bbox: 0.2070, loss_mask: 0.2217, loss: 0.6388 2024-05-29 23:12:35,346 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 4:20:43, time: 0.670, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0381, loss_cls: 0.1539, acc: 94.2888, loss_bbox: 0.2036, loss_mask: 0.2190, loss: 0.6283 2024-05-29 23:13:08,916 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 4:20:08, time: 0.671, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0389, loss_cls: 0.1527, acc: 94.3159, loss_bbox: 0.2028, loss_mask: 0.2138, loss: 0.6208 2024-05-29 23:13:42,482 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 4:19:33, time: 0.671, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0357, loss_cls: 0.1445, acc: 94.4099, loss_bbox: 0.2002, loss_mask: 0.2139, loss: 0.6066 2024-05-29 23:14:16,063 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 4:18:58, time: 0.672, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0398, loss_cls: 0.1546, acc: 94.1929, loss_bbox: 0.2039, loss_mask: 0.2178, loss: 0.6299 2024-05-29 23:14:49,542 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 4:18:24, time: 0.670, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0365, loss_cls: 0.1523, acc: 94.2468, loss_bbox: 0.2039, loss_mask: 0.2134, loss: 0.6187 2024-05-29 23:15:22,501 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 4:17:49, time: 0.659, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0361, loss_cls: 0.1499, acc: 94.3030, loss_bbox: 0.2030, loss_mask: 0.2158, loss: 0.6166 2024-05-29 23:15:58,664 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 4:17:15, time: 0.723, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0400, loss_cls: 0.1615, acc: 93.9707, loss_bbox: 0.2103, loss_mask: 0.2146, loss: 0.6400 2024-05-29 23:16:35,038 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 4:16:41, time: 0.727, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0396, loss_cls: 0.1562, acc: 94.0444, loss_bbox: 0.2105, loss_mask: 0.2155, loss: 0.6361 2024-05-29 23:17:12,936 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 4:16:07, time: 0.758, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0360, loss_cls: 0.1458, acc: 94.5215, loss_bbox: 0.1974, loss_mask: 0.2103, loss: 0.6021 2024-05-29 23:17:46,850 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 4:15:33, time: 0.678, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0408, loss_cls: 0.1554, acc: 94.0942, loss_bbox: 0.2071, loss_mask: 0.2160, loss: 
0.6335
2024-05-29 23:18:20,207 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 4:14:58, time: 0.667, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0375, loss_cls: 0.1524, acc: 94.4006, loss_bbox: 0.1977, loss_mask: 0.2166, loss: 0.6162
2024-05-29 23:18:53,680 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 4:14:23, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0394, loss_cls: 0.1548, acc: 94.1897, loss_bbox: 0.2036, loss_mask: 0.2176, loss: 0.6281
2024-05-29 23:19:29,550 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 4:13:49, time: 0.717, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0399, loss_cls: 0.1667, acc: 93.6636, loss_bbox: 0.2193, loss_mask: 0.2210, loss: 0.6603
2024-05-30 00:47:12,078 is not here — 2024-05-29 23:20:09,717 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 4:13:16, time: 0.803, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0387, loss_cls: 0.1596, acc: 94.0234, loss_bbox: 0.2106, loss_mask: 0.2207, loss: 0.6432
2024-05-29 23:20:32,715 - mmdet - INFO - Saving checkpoint at 9 epochs
2024-05-29 23:22:24,064 - mmdet - INFO - Evaluating bbox...
2024-05-29 23:22:44,530 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.460
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.681
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.498
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.283
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.504
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.626
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.389
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.626
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.747
2024-05-29 23:22:44,530 - mmdet - INFO - Evaluating segm...
2024-05-29 23:23:09,761 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.411
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.648
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.442
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.198
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.447
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.519
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.519
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.519
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.318
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.567
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.706
2024-05-29 23:23:10,062 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-29 23:23:10,064 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4600, bbox_mAP_50: 0.6810, bbox_mAP_75: 0.4980, bbox_mAP_s: 0.2830, bbox_mAP_m: 0.5040, bbox_mAP_l: 0.6260, bbox_mAP_copypaste: 0.460 0.681 0.498 0.283 0.504 0.626, segm_mAP: 0.4110, segm_mAP_50: 0.6480, segm_mAP_75: 0.4420, segm_mAP_s: 0.1980, segm_mAP_m: 0.4470, segm_mAP_l: 0.6250, segm_mAP_copypaste: 0.411 0.648 0.442 0.198 0.447 0.625
2024-05-29 23:23:52,499 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 4:12:17, time: 0.848, data_time: 0.128, memory: 11628, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0399, loss_cls: 0.1564, acc: 94.0723, loss_bbox: 0.2081, loss_mask: 0.2172, loss: 0.6363
2024-05-29 23:24:26,403 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 4:11:42, time: 0.678, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0409, loss_cls: 0.1521, acc: 94.2227, loss_bbox: 0.2073, loss_mask: 0.2205, loss: 0.6339
2024-05-29 23:25:00,128 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 4:11:07, time: 0.675, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0379, loss_cls: 0.1548, acc: 94.0669, loss_bbox: 0.2044, loss_mask: 0.2188, loss: 0.6289
2024-05-29 23:25:33,549 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 4:10:33, time: 0.668, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0372, loss_cls: 0.1413, acc: 94.6738, loss_bbox: 0.1896, loss_mask: 0.2107, loss: 0.5910
2024-05-29 23:26:06,809 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 4:09:58, time: 0.665, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0352, loss_cls: 0.1457, acc: 94.5391, loss_bbox: 0.1960, loss_mask: 0.2162, loss: 0.6053
2024-05-29 23:26:40,388 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 4:09:23, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0397, loss_cls: 0.1452, acc: 94.4463, loss_bbox: 0.1985, loss_mask: 0.2107, loss: 0.6062
2024-05-29 23:27:13,589 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 4:08:48, time: 0.664, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0368, loss_cls: 0.1474, acc: 94.4143, loss_bbox: 0.2000, loss_mask: 0.2180, loss: 0.6136
2024-05-29 23:27:47,421 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 4:08:13, time: 0.677, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0396, loss_cls: 0.1536, acc: 94.2002, loss_bbox: 0.2073,
loss_mask: 0.2168, loss: 0.6313 2024-05-29 23:28:20,656 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 4:07:38, time: 0.665, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0377, loss_cls: 0.1537, acc: 94.2424, loss_bbox: 0.2063, loss_mask: 0.2195, loss: 0.6308 2024-05-29 23:28:53,573 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 4:07:03, time: 0.658, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0373, loss_cls: 0.1481, acc: 94.4504, loss_bbox: 0.1995, loss_mask: 0.2113, loss: 0.6088 2024-05-29 23:29:27,168 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 4:06:29, time: 0.672, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0386, loss_cls: 0.1474, acc: 94.3115, loss_bbox: 0.2042, loss_mask: 0.2154, loss: 0.6181 2024-05-29 23:30:01,159 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 4:05:54, time: 0.680, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0383, loss_cls: 0.1547, acc: 94.1833, loss_bbox: 0.2047, loss_mask: 0.2182, loss: 0.6285 2024-05-29 23:30:34,610 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 4:05:19, time: 0.669, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0367, loss_cls: 0.1468, acc: 94.3948, loss_bbox: 0.2009, loss_mask: 0.2140, loss: 0.6101 2024-05-29 23:31:07,966 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 4:04:44, time: 0.667, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0406, loss_cls: 0.1524, acc: 94.2556, loss_bbox: 0.2091, loss_mask: 0.2171, loss: 0.6328 2024-05-29 23:31:42,061 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 4:04:10, time: 0.682, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0365, loss_cls: 0.1425, acc: 94.6926, loss_bbox: 0.1914, loss_mask: 0.2094, loss: 0.5913 2024-05-29 23:32:14,632 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 4:03:35, time: 0.651, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0379, loss_cls: 0.1483, acc: 94.3594, loss_bbox: 0.2004, loss_mask: 0.2179, loss: 0.6166 2024-05-29 23:32:48,535 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 4:03:00, time: 0.678, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0384, loss_cls: 0.1583, acc: 93.9495, loss_bbox: 0.2081, loss_mask: 0.2142, loss: 0.6321 2024-05-29 23:33:21,935 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 4:02:25, time: 0.668, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0378, loss_cls: 0.1497, acc: 94.2717, loss_bbox: 0.2016, loss_mask: 0.2166, loss: 0.6190 2024-05-29 23:33:57,551 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 4:01:51, time: 0.712, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0384, loss_cls: 0.1492, acc: 94.4043, loss_bbox: 0.1955, loss_mask: 0.2142, loss: 0.6098 2024-05-29 23:34:31,448 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 4:01:16, time: 0.678, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0370, loss_cls: 0.1451, acc: 94.5017, loss_bbox: 0.1994, loss_mask: 0.2130, loss: 0.6071 2024-05-29 23:35:12,564 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 4:00:44, time: 0.822, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0397, loss_cls: 0.1499, acc: 94.2319, loss_bbox: 0.2021, loss_mask: 0.2119, loss: 0.6167 2024-05-29 23:35:48,732 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 4:00:10, time: 0.724, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0414, loss_cls: 0.1613, acc: 93.8601, loss_bbox: 0.2187, loss_mask: 0.2218, loss: 0.6575 2024-05-29 23:36:22,652 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 3:59:35, time: 0.678, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0367, loss_cls: 0.1474, acc: 94.3704, loss_bbox: 0.2012, loss_mask: 0.2163, loss: 0.6138 2024-05-29 23:37:03,648 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 3:59:03, time: 0.820, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0359, loss_cls: 0.1452, acc: 94.4275, loss_bbox: 0.1968, loss_mask: 0.2125, loss: 0.6027 2024-05-29 23:37:37,204 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 3:58:28, time: 0.671, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0377, loss_cls: 0.1462, acc: 94.4636, loss_bbox: 0.1995, loss_mask: 0.2162, loss: 0.6120 2024-05-29 23:38:10,739 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 3:57:53, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0380, loss_cls: 0.1454, acc: 94.4199, loss_bbox: 0.1993, loss_mask: 0.2135, loss: 0.6098 2024-05-29 23:38:43,789 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 3:57:18, time: 0.661, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0372, loss_cls: 0.1415, acc: 94.5730, loss_bbox: 0.1934, loss_mask: 0.2106, loss: 0.5941 2024-05-29 23:39:17,503 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 3:56:44, time: 0.674, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0386, loss_cls: 0.1555, acc: 94.0750, loss_bbox: 0.2108, loss_mask: 0.2194, loss: 0.6383 2024-05-29 23:39:51,205 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 3:56:09, time: 0.674, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0383, loss_cls: 0.1544, acc: 94.1687, loss_bbox: 0.2071, loss_mask: 0.2128, loss: 0.6260 2024-05-29 23:40:26,513 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 3:55:35, time: 0.706, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0426, loss_cls: 0.1588, acc: 94.0205, loss_bbox: 0.2082, loss_mask: 0.2163, loss: 0.6401 2024-05-29 23:41:00,224 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 3:55:00, time: 0.674, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0389, loss_cls: 0.1586, acc: 94.0347, loss_bbox: 0.2094, loss_mask: 0.2178, loss: 0.6391 2024-05-29 23:41:34,012 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 3:54:25, time: 0.676, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0390, loss_cls: 0.1541, acc: 94.0796, loss_bbox: 0.2061, loss_mask: 0.2145, loss: 0.6266 2024-05-29 23:42:08,028 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 3:53:50, time: 0.680, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0436, loss_cls: 0.1556, acc: 94.0603, loss_bbox: 0.2087, loss_mask: 0.2202, loss: 0.6412 2024-05-29 23:42:41,183 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 3:53:16, time: 0.663, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0381, loss_cls: 0.1487, acc: 94.4211, loss_bbox: 0.2027, loss_mask: 0.2180, loss: 0.6200 2024-05-29 23:43:14,353 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 3:52:41, time: 0.663, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0135, 
loss_rpn_bbox: 0.0385, loss_cls: 0.1550, acc: 94.1848, loss_bbox: 0.2046, loss_mask: 0.2145, loss: 0.6262 2024-05-29 23:43:48,413 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 3:52:06, time: 0.681, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0412, loss_cls: 0.1533, acc: 94.1609, loss_bbox: 0.2082, loss_mask: 0.2147, loss: 0.6296 2024-05-29 23:44:24,115 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 3:51:32, time: 0.714, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0383, loss_cls: 0.1472, acc: 94.3782, loss_bbox: 0.1994, loss_mask: 0.2093, loss: 0.6061 2024-05-29 23:45:01,634 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 3:50:58, time: 0.750, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0401, loss_cls: 0.1543, acc: 94.1685, loss_bbox: 0.2118, loss_mask: 0.2205, loss: 0.6404 2024-05-29 23:45:41,484 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 3:50:25, time: 0.797, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0399, loss_cls: 0.1535, acc: 94.1438, loss_bbox: 0.2084, loss_mask: 0.2169, loss: 0.6336 2024-05-29 23:46:14,767 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 3:49:51, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0375, loss_cls: 0.1494, acc: 94.3916, loss_bbox: 0.1987, loss_mask: 0.2090, loss: 0.6075 2024-05-29 23:46:49,840 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 3:49:16, time: 0.701, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0367, loss_cls: 0.1458, acc: 94.4138, loss_bbox: 0.1984, loss_mask: 0.2076, loss: 0.6010 2024-05-29 23:47:29,783 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 3:48:43, time: 0.799, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0383, loss_cls: 0.1510, acc: 94.2969, loss_bbox: 0.2050, loss_mask: 0.2158, loss: 0.6233 2024-05-29 23:48:03,472 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 3:48:09, time: 0.674, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0404, loss_cls: 0.1571, acc: 94.0535, loss_bbox: 0.2126, loss_mask: 0.2195, loss: 0.6430 2024-05-29 23:48:37,165 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 3:47:34, time: 0.674, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0404, loss_cls: 0.1501, acc: 94.2751, loss_bbox: 0.2074, loss_mask: 0.2161, loss: 0.6271 2024-05-29 23:49:10,706 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 3:46:59, time: 0.671, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0376, loss_cls: 0.1486, acc: 94.3608, loss_bbox: 0.2020, loss_mask: 0.2158, loss: 0.6172 2024-05-29 23:49:44,087 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 3:46:24, time: 0.668, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0414, loss_cls: 0.1552, acc: 94.0801, loss_bbox: 0.2081, loss_mask: 0.2220, loss: 0.6412 2024-05-29 23:50:18,075 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 3:45:50, time: 0.680, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0401, loss_cls: 0.1526, acc: 94.1260, loss_bbox: 0.2099, loss_mask: 0.2190, loss: 0.6344 2024-05-29 23:50:51,172 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 3:45:15, time: 0.662, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0376, loss_cls: 0.1490, acc: 94.4614, loss_bbox: 0.1972, loss_mask: 0.2115, 
loss: 0.6083 2024-05-29 23:51:24,883 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 3:44:40, time: 0.674, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0372, loss_cls: 0.1500, acc: 94.3442, loss_bbox: 0.2011, loss_mask: 0.2151, loss: 0.6165 2024-05-29 23:51:58,045 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 3:44:05, time: 0.663, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0377, loss_cls: 0.1447, acc: 94.3860, loss_bbox: 0.1981, loss_mask: 0.2089, loss: 0.6017 2024-05-29 23:52:31,897 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 3:43:30, time: 0.677, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0394, loss_cls: 0.1561, acc: 94.1313, loss_bbox: 0.2066, loss_mask: 0.2172, loss: 0.6333 2024-05-29 23:53:05,407 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 3:42:56, time: 0.670, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0392, loss_cls: 0.1520, acc: 94.1821, loss_bbox: 0.2052, loss_mask: 0.2154, loss: 0.6246 2024-05-29 23:53:39,847 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 3:42:21, time: 0.688, data_time: 0.070, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0396, loss_cls: 0.1511, acc: 94.2903, loss_bbox: 0.2025, loss_mask: 0.2172, loss: 0.6238 2024-05-29 23:54:15,821 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 3:41:47, time: 0.720, data_time: 0.066, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0393, loss_cls: 0.1580, acc: 94.0139, loss_bbox: 0.2107, loss_mask: 0.2209, loss: 0.6429 2024-05-29 23:54:49,126 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 3:41:12, time: 0.666, data_time: 0.062, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0410, loss_cls: 0.1579, acc: 93.9329, loss_bbox: 0.2153, loss_mask: 0.2182, loss: 0.6461 2024-05-29 23:55:28,353 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 3:40:39, time: 0.785, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0390, loss_cls: 0.1570, acc: 93.9753, loss_bbox: 0.2081, loss_mask: 0.2151, loss: 0.6331 2024-05-29 23:56:06,179 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 3:40:06, time: 0.756, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0382, loss_cls: 0.1510, acc: 94.2358, loss_bbox: 0.2019, loss_mask: 0.2123, loss: 0.6167 2024-05-29 23:56:39,693 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 3:39:31, time: 0.670, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0390, loss_cls: 0.1473, acc: 94.4036, loss_bbox: 0.1998, loss_mask: 0.2116, loss: 0.6113 2024-05-29 23:57:17,321 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 3:38:57, time: 0.753, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0383, loss_cls: 0.1514, acc: 94.3127, loss_bbox: 0.2043, loss_mask: 0.2133, loss: 0.6210 2024-05-29 23:57:52,500 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 3:38:23, time: 0.704, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0361, loss_cls: 0.1505, acc: 94.3030, loss_bbox: 0.1992, loss_mask: 0.2142, loss: 0.6124 2024-05-29 23:58:26,126 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 3:37:48, time: 0.672, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0393, loss_cls: 0.1488, acc: 94.3176, loss_bbox: 0.2057, loss_mask: 0.2151, loss: 0.6215 2024-05-29 23:58:59,731 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
3:37:13, time: 0.672, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0426, loss_cls: 0.1636, acc: 93.8452, loss_bbox: 0.2148, loss_mask: 0.2150, loss: 0.6513 2024-05-29 23:59:33,116 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 3:36:39, time: 0.668, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0391, loss_cls: 0.1566, acc: 94.0688, loss_bbox: 0.2090, loss_mask: 0.2224, loss: 0.6410 2024-05-30 00:00:06,629 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 3:36:04, time: 0.670, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0372, loss_cls: 0.1524, acc: 94.1252, loss_bbox: 0.2027, loss_mask: 0.2166, loss: 0.6228 2024-05-30 00:00:39,772 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 3:35:29, time: 0.663, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0400, loss_cls: 0.1535, acc: 94.1614, loss_bbox: 0.2103, loss_mask: 0.2185, loss: 0.6346 2024-05-30 00:01:13,761 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 3:34:54, time: 0.680, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0376, loss_cls: 0.1487, acc: 94.3208, loss_bbox: 0.2015, loss_mask: 0.2124, loss: 0.6132 2024-05-30 00:01:47,249 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 3:34:20, time: 0.670, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0372, loss_cls: 0.1487, acc: 94.3044, loss_bbox: 0.2008, loss_mask: 0.2128, loss: 0.6114 2024-05-30 00:02:20,353 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 3:33:45, time: 0.662, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0370, loss_cls: 0.1443, acc: 94.4846, loss_bbox: 0.1956, loss_mask: 0.2077, loss: 0.5965 2024-05-30 00:02:53,844 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 3:33:10, time: 0.670, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0362, loss_cls: 0.1484, acc: 94.4092, loss_bbox: 0.1942, loss_mask: 0.2126, loss: 0.6043 2024-05-30 00:03:26,733 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 3:32:35, time: 0.658, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0353, loss_cls: 0.1399, acc: 94.6235, loss_bbox: 0.1935, loss_mask: 0.2075, loss: 0.5871 2024-05-30 00:04:00,287 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 3:32:00, time: 0.671, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0373, loss_cls: 0.1536, acc: 94.1179, loss_bbox: 0.2070, loss_mask: 0.2186, loss: 0.6299 2024-05-30 00:04:35,398 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 3:31:26, time: 0.702, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0370, loss_cls: 0.1473, acc: 94.4036, loss_bbox: 0.2004, loss_mask: 0.2129, loss: 0.6100 2024-05-30 00:05:08,928 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 3:30:51, time: 0.671, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0397, loss_cls: 0.1520, acc: 94.2344, loss_bbox: 0.2057, loss_mask: 0.2202, loss: 0.6318 2024-05-30 00:05:50,990 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 3:30:19, time: 0.841, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0374, loss_cls: 0.1453, acc: 94.4561, loss_bbox: 0.2005, loss_mask: 0.2134, loss: 0.6089 2024-05-30 00:06:24,994 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 3:29:44, time: 0.680, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 
0.0419, loss_cls: 0.1545, acc: 94.1565, loss_bbox: 0.2091, loss_mask: 0.2193, loss: 0.6373 2024-05-30 00:07:01,033 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 3:29:10, time: 0.721, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0416, loss_cls: 0.1577, acc: 93.9458, loss_bbox: 0.2144, loss_mask: 0.2235, loss: 0.6508 2024-05-30 00:07:39,356 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 3:28:36, time: 0.766, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0391, loss_cls: 0.1518, acc: 94.2085, loss_bbox: 0.2048, loss_mask: 0.2169, loss: 0.6264 2024-05-30 00:08:13,307 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 3:28:02, time: 0.679, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0401, loss_cls: 0.1572, acc: 94.0085, loss_bbox: 0.2132, loss_mask: 0.2227, loss: 0.6483 2024-05-30 00:08:47,016 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 3:27:27, time: 0.674, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0396, loss_cls: 0.1548, acc: 94.0562, loss_bbox: 0.2129, loss_mask: 0.2131, loss: 0.6335 2024-05-30 00:09:21,819 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 3:26:53, time: 0.696, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0418, loss_cls: 0.1559, acc: 94.0779, loss_bbox: 0.2062, loss_mask: 0.2184, loss: 0.6364 2024-05-30 00:09:55,001 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 3:26:18, time: 0.664, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0360, loss_cls: 0.1411, acc: 94.6106, loss_bbox: 0.1933, loss_mask: 0.2093, loss: 0.5908 2024-05-30 00:10:28,192 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 3:25:43, time: 0.664, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0377, loss_cls: 0.1441, acc: 94.4690, loss_bbox: 0.1963, loss_mask: 0.2123, loss: 0.6028 2024-05-30 00:11:02,256 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 3:25:08, time: 0.681, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0415, loss_cls: 0.1496, acc: 94.3813, loss_bbox: 0.2052, loss_mask: 0.2147, loss: 0.6240 2024-05-30 00:11:36,383 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 3:24:34, time: 0.683, data_time: 0.070, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0400, loss_cls: 0.1542, acc: 94.1045, loss_bbox: 0.2110, loss_mask: 0.2163, loss: 0.6350 2024-05-30 00:12:09,763 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 3:23:59, time: 0.667, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0369, loss_cls: 0.1488, acc: 94.4019, loss_bbox: 0.2041, loss_mask: 0.2148, loss: 0.6174 2024-05-30 00:12:43,818 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 3:23:24, time: 0.681, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0405, loss_cls: 0.1543, acc: 94.2449, loss_bbox: 0.2088, loss_mask: 0.2151, loss: 0.6314 2024-05-30 00:13:17,276 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 3:22:50, time: 0.669, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0394, loss_cls: 0.1483, acc: 94.3481, loss_bbox: 0.2012, loss_mask: 0.2173, loss: 0.6191 2024-05-30 00:13:51,402 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 3:22:15, time: 0.683, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0396, loss_cls: 0.1490, acc: 94.3105, loss_bbox: 0.2002, loss_mask: 0.2112, loss: 0.6133 
2024-05-30 00:14:26,891 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 3:21:41, time: 0.710, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0348, loss_cls: 0.1397, acc: 94.6570, loss_bbox: 0.1897, loss_mask: 0.2069, loss: 0.5828 2024-05-30 00:14:59,669 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 3:21:06, time: 0.656, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0367, loss_cls: 0.1488, acc: 94.4175, loss_bbox: 0.2002, loss_mask: 0.2126, loss: 0.6099 2024-05-30 00:15:36,444 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 3:20:32, time: 0.736, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0403, loss_cls: 0.1526, acc: 94.2434, loss_bbox: 0.2029, loss_mask: 0.2129, loss: 0.6231 2024-05-30 00:16:15,992 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 3:19:59, time: 0.791, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0402, loss_cls: 0.1521, acc: 94.2310, loss_bbox: 0.2038, loss_mask: 0.2131, loss: 0.6217 2024-05-30 00:16:48,930 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 3:19:24, time: 0.659, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0384, loss_cls: 0.1539, acc: 94.1589, loss_bbox: 0.2059, loss_mask: 0.2164, loss: 0.6285 2024-05-30 00:17:26,832 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 3:18:50, time: 0.758, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0395, loss_cls: 0.1533, acc: 94.1763, loss_bbox: 0.2053, loss_mask: 0.2158, loss: 0.6265 2024-05-30 00:18:02,372 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 3:18:16, time: 0.711, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0370, loss_cls: 0.1440, acc: 94.5012, loss_bbox: 0.1974, loss_mask: 0.2108, loss: 0.6012 2024-05-30 00:18:35,971 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 3:17:41, time: 0.672, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0400, loss_cls: 0.1578, acc: 94.0132, loss_bbox: 0.2103, loss_mask: 0.2190, loss: 0.6401 2024-05-30 00:19:09,085 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 3:17:06, time: 0.662, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0366, loss_cls: 0.1467, acc: 94.5205, loss_bbox: 0.1972, loss_mask: 0.2166, loss: 0.6084 2024-05-30 00:19:42,161 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 3:16:31, time: 0.661, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0345, loss_cls: 0.1454, acc: 94.4858, loss_bbox: 0.1957, loss_mask: 0.2115, loss: 0.5991 2024-05-30 00:20:15,815 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 3:15:57, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0386, loss_cls: 0.1436, acc: 94.5054, loss_bbox: 0.1985, loss_mask: 0.2063, loss: 0.5991 2024-05-30 00:20:49,490 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 3:15:22, time: 0.674, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0391, loss_cls: 0.1541, acc: 94.1223, loss_bbox: 0.2066, loss_mask: 0.2137, loss: 0.6285 2024-05-30 00:21:23,173 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 3:14:47, time: 0.674, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0373, loss_cls: 0.1450, acc: 94.4663, loss_bbox: 0.1965, loss_mask: 0.2171, loss: 0.6078 2024-05-30 00:21:56,436 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 3:14:13, 
time: 0.665, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0373, loss_cls: 0.1405, acc: 94.7124, loss_bbox: 0.1872, loss_mask: 0.2108, loss: 0.5881 2024-05-30 00:22:29,767 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 3:13:38, time: 0.667, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0383, loss_cls: 0.1464, acc: 94.3840, loss_bbox: 0.1947, loss_mask: 0.2105, loss: 0.6027 2024-05-30 00:23:02,898 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 3:13:03, time: 0.663, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0372, loss_cls: 0.1477, acc: 94.3967, loss_bbox: 0.1965, loss_mask: 0.2169, loss: 0.6095 2024-05-30 00:23:37,266 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 3:12:28, time: 0.687, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0387, loss_cls: 0.1498, acc: 94.2861, loss_bbox: 0.2030, loss_mask: 0.2208, loss: 0.6249 2024-05-30 00:24:10,539 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 3:11:54, time: 0.665, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0383, loss_cls: 0.1451, acc: 94.5417, loss_bbox: 0.1972, loss_mask: 0.2117, loss: 0.6051 2024-05-30 00:24:46,791 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 3:11:20, time: 0.725, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0391, loss_cls: 0.1533, acc: 94.1504, loss_bbox: 0.2087, loss_mask: 0.2183, loss: 0.6320 2024-05-30 00:25:20,123 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 3:10:45, time: 0.666, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0380, loss_cls: 0.1474, acc: 94.3965, loss_bbox: 0.1969, loss_mask: 0.2089, loss: 0.6039 2024-05-30 00:26:00,874 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 3:10:12, time: 0.815, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0402, loss_cls: 0.1517, acc: 94.1731, loss_bbox: 0.2056, loss_mask: 0.2132, loss: 0.6243 2024-05-30 00:26:36,576 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 3:09:38, time: 0.714, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0375, loss_cls: 0.1530, acc: 94.2092, loss_bbox: 0.2070, loss_mask: 0.2196, loss: 0.6303 2024-05-30 00:27:09,997 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 3:09:03, time: 0.668, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0392, loss_cls: 0.1454, acc: 94.4478, loss_bbox: 0.1905, loss_mask: 0.2123, loss: 0.6001 2024-05-30 00:27:49,806 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 3:08:30, time: 0.796, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0388, loss_cls: 0.1477, acc: 94.3828, loss_bbox: 0.2007, loss_mask: 0.2124, loss: 0.6119 2024-05-30 00:28:22,717 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 3:07:55, time: 0.658, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0371, loss_cls: 0.1502, acc: 94.3191, loss_bbox: 0.1961, loss_mask: 0.2110, loss: 0.6072 2024-05-30 00:28:56,453 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 3:07:20, time: 0.675, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0415, loss_cls: 0.1579, acc: 93.9990, loss_bbox: 0.2142, loss_mask: 0.2213, loss: 0.6488 2024-05-30 00:29:29,300 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 3:06:45, time: 0.657, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0376, 
loss_cls: 0.1438, acc: 94.5359, loss_bbox: 0.1961, loss_mask: 0.2107, loss: 0.5998 2024-05-30 00:30:02,669 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 3:06:10, time: 0.667, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0378, loss_cls: 0.1482, acc: 94.3613, loss_bbox: 0.1979, loss_mask: 0.2101, loss: 0.6062 2024-05-30 00:30:36,608 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 3:05:36, time: 0.679, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0384, loss_cls: 0.1535, acc: 94.1477, loss_bbox: 0.2061, loss_mask: 0.2161, loss: 0.6273 2024-05-30 00:31:09,888 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 3:05:01, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0403, loss_cls: 0.1525, acc: 94.2517, loss_bbox: 0.2022, loss_mask: 0.2175, loss: 0.6254 2024-05-30 00:31:43,063 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 3:04:26, time: 0.663, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0371, loss_cls: 0.1517, acc: 94.3467, loss_bbox: 0.1974, loss_mask: 0.2157, loss: 0.6143 2024-05-30 00:32:16,565 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 3:03:51, time: 0.670, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0365, loss_cls: 0.1478, acc: 94.3184, loss_bbox: 0.1995, loss_mask: 0.2147, loss: 0.6105 2024-05-30 00:32:50,078 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 3:03:17, time: 0.670, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0368, loss_cls: 0.1483, acc: 94.3711, loss_bbox: 0.1980, loss_mask: 0.2076, loss: 0.6030 2024-05-30 00:33:23,310 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 3:02:42, time: 0.665, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0355, loss_cls: 0.1407, acc: 94.6311, loss_bbox: 0.1948, loss_mask: 0.2094, loss: 0.5929 2024-05-30 00:33:56,635 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 3:02:07, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0386, loss_cls: 0.1460, acc: 94.4331, loss_bbox: 0.1993, loss_mask: 0.2097, loss: 0.6068 2024-05-30 00:34:29,724 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 3:01:32, time: 0.662, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0375, loss_cls: 0.1490, acc: 94.3347, loss_bbox: 0.1994, loss_mask: 0.2149, loss: 0.6139 2024-05-30 00:35:05,468 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 3:00:58, time: 0.715, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0387, loss_cls: 0.1469, acc: 94.4707, loss_bbox: 0.1996, loss_mask: 0.2194, loss: 0.6182 2024-05-30 00:35:42,569 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 3:00:24, time: 0.742, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0391, loss_cls: 0.1537, acc: 94.1650, loss_bbox: 0.2106, loss_mask: 0.2196, loss: 0.6354 2024-05-30 00:36:22,974 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 2:59:51, time: 0.808, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0389, loss_cls: 0.1535, acc: 94.2358, loss_bbox: 0.2055, loss_mask: 0.2155, loss: 0.6264 2024-05-30 00:36:55,937 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 2:59:16, time: 0.659, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0387, loss_cls: 0.1510, acc: 94.3174, loss_bbox: 0.2059, loss_mask: 0.2163, loss: 0.6251 2024-05-30 
00:37:31,020 - mmdet - INFO - Epoch [10][6450/7330] lr: 1.000e-05, eta: 2:58:42, time: 0.702, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0382, loss_cls: 0.1472, acc: 94.4893, loss_bbox: 0.1991, loss_mask: 0.2144, loss: 0.6119 2024-05-30 00:38:06,420 - mmdet - INFO - Epoch [10][6500/7330] lr: 1.000e-05, eta: 2:58:08, time: 0.708, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0365, loss_cls: 0.1445, acc: 94.5010, loss_bbox: 0.1898, loss_mask: 0.2060, loss: 0.5896 2024-05-30 00:38:39,965 - mmdet - INFO - Epoch [10][6550/7330] lr: 1.000e-05, eta: 2:57:33, time: 0.671, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0397, loss_cls: 0.1525, acc: 94.2920, loss_bbox: 0.2008, loss_mask: 0.2135, loss: 0.6204 2024-05-30 00:39:13,088 - mmdet - INFO - Epoch [10][6600/7330] lr: 1.000e-05, eta: 2:56:58, time: 0.663, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0386, loss_cls: 0.1462, acc: 94.4768, loss_bbox: 0.1962, loss_mask: 0.2111, loss: 0.6047 2024-05-30 00:39:46,533 - mmdet - INFO - Epoch [10][6650/7330] lr: 1.000e-05, eta: 2:56:23, time: 0.669, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0387, loss_cls: 0.1453, acc: 94.4458, loss_bbox: 0.2003, loss_mask: 0.2115, loss: 0.6087 2024-05-30 00:40:20,087 - mmdet - INFO - Epoch [10][6700/7330] lr: 1.000e-05, eta: 2:55:49, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0369, loss_cls: 0.1429, acc: 94.6096, loss_bbox: 0.1963, loss_mask: 0.2124, loss: 0.6000 2024-05-30 00:40:53,836 - mmdet - INFO - Epoch [10][6750/7330] lr: 1.000e-05, eta: 2:55:14, time: 0.675, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0392, loss_cls: 0.1523, acc: 94.1687, loss_bbox: 0.2047, loss_mask: 0.2169, loss: 0.6266 2024-05-30 00:41:26,766 - mmdet - INFO - Epoch [10][6800/7330] lr: 1.000e-05, eta: 2:54:39, time: 0.659, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0374, loss_cls: 0.1512, acc: 94.2153, loss_bbox: 0.2010, loss_mask: 0.2128, loss: 0.6140 2024-05-30 00:41:59,965 - mmdet - INFO - Epoch [10][6850/7330] lr: 1.000e-05, eta: 2:54:04, time: 0.664, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0384, loss_cls: 0.1478, acc: 94.4221, loss_bbox: 0.1993, loss_mask: 0.2111, loss: 0.6097 2024-05-30 00:42:33,265 - mmdet - INFO - Epoch [10][6900/7330] lr: 1.000e-05, eta: 2:53:30, time: 0.666, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0383, loss_cls: 0.1494, acc: 94.2351, loss_bbox: 0.2034, loss_mask: 0.2149, loss: 0.6186 2024-05-30 00:43:06,758 - mmdet - INFO - Epoch [10][6950/7330] lr: 1.000e-05, eta: 2:52:55, time: 0.670, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0370, loss_cls: 0.1464, acc: 94.3633, loss_bbox: 0.1992, loss_mask: 0.2115, loss: 0.6067 2024-05-30 00:43:39,906 - mmdet - INFO - Epoch [10][7000/7330] lr: 1.000e-05, eta: 2:52:20, time: 0.663, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0375, loss_cls: 0.1433, acc: 94.5571, loss_bbox: 0.1933, loss_mask: 0.2073, loss: 0.5931 2024-05-30 00:44:13,143 - mmdet - INFO - Epoch [10][7050/7330] lr: 1.000e-05, eta: 2:51:45, time: 0.665, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0371, loss_cls: 0.1471, acc: 94.4829, loss_bbox: 0.2010, loss_mask: 0.2156, loss: 0.6135 2024-05-30 00:44:48,304 - mmdet - INFO - Epoch [10][7100/7330] lr: 1.000e-05, eta: 2:51:11, time: 0.703, 
data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0382, loss_cls: 0.1427, acc: 94.4658, loss_bbox: 0.1962, loss_mask: 0.2106, loss: 0.6005
2024-05-30 00:45:22,351 - mmdet - INFO - Epoch [10][7150/7330] lr: 1.000e-05, eta: 2:50:37, time: 0.681, data_time: 0.075, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0416, loss_cls: 0.1580, acc: 93.9958, loss_bbox: 0.2100, loss_mask: 0.2137, loss: 0.6371
2024-05-30 00:46:00,921 - mmdet - INFO - Epoch [10][7200/7330] lr: 1.000e-05, eta: 2:50:03, time: 0.771, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0399, loss_cls: 0.1524, acc: 94.2925, loss_bbox: 0.2045, loss_mask: 0.2115, loss: 0.6222
2024-05-30 00:46:38,464 - mmdet - INFO - Epoch [10][7250/7330] lr: 1.000e-05, eta: 2:49:29, time: 0.751, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0351, loss_cls: 0.1477, acc: 94.3806, loss_bbox: 0.1961, loss_mask: 0.2096, loss: 0.6005
2024-05-30 00:47:12,078 - mmdet - INFO - Epoch [10][7300/7330] lr: 1.000e-05, eta: 2:48:54, time: 0.672, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0399, loss_cls: 0.1586, acc: 94.0127, loss_bbox: 0.2076, loss_mask: 0.2182, loss: 0.6386
2024-05-30 00:47:38,212 - mmdet - INFO - Saving checkpoint at 10 epochs
2024-05-30 00:49:28,143 - mmdet - INFO - Evaluating bbox...
2024-05-30 00:49:49,303 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.462
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.682
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.503
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.274
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.505
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.578
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.578
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.578
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.381
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.623
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.749
2024-05-30 00:49:49,304 - mmdet - INFO - Evaluating segm...
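The AP/AR block above is the standard pycocotools summary. A minimal sketch of how such a table can be reproduced offline from a COCO-format ground-truth file and a dumped detection result file (both paths below are placeholders, not taken from this run):

# Minimal sketch (assumed paths): reproduce a COCO-style AP/AR summary
# like the block above with pycocotools.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO('annotations/instances_val2017.json')   # placeholder ground-truth path
coco_dt = coco_gt.loadRes('results.bbox.json')          # placeholder result file
coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')  # use 'segm' for the mask table
coco_eval.params.maxDets = [100, 300, 1000]             # matches the maxDets values shown in this log
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()        # prints the Average Precision / Average Recall lines
print(coco_eval.stats[0])    # AP @[ IoU=0.50:0.95 | area=all ], e.g. 0.462 above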
2024-05-30 00:50:12,488 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.412
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.648
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.445
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.191
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.447
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.624
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.307
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.565
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.709
2024-05-30 00:50:12,828 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-30 00:50:12,830 - mmdet - INFO - Epoch(val) [10][625] bbox_mAP: 0.4620, bbox_mAP_50: 0.6820, bbox_mAP_75: 0.5030, bbox_mAP_s: 0.2740, bbox_mAP_m: 0.5050, bbox_mAP_l: 0.6300, bbox_mAP_copypaste: 0.462 0.682 0.503 0.274 0.505 0.630, segm_mAP: 0.4120, segm_mAP_50: 0.6480, segm_mAP_75: 0.4450, segm_mAP_s: 0.1910, segm_mAP_m: 0.4470, segm_mAP_l: 0.6240, segm_mAP_copypaste: 0.412 0.648 0.445 0.191 0.447 0.624
2024-05-30 00:50:59,732 - mmdet - INFO - Epoch [11][50/7330] lr: 1.000e-05, eta: 2:47:57, time: 0.938, data_time: 0.119, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0391, loss_cls: 0.1481, acc: 94.3816, loss_bbox: 0.2026, loss_mask: 0.2142, loss: 0.6164
2024-05-30 00:51:33,434 - mmdet - INFO - Epoch [11][100/7330] lr: 1.000e-05, eta: 2:47:23, time: 0.674, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0383, loss_cls: 0.1521, acc: 94.1536, loss_bbox: 0.2041, loss_mask: 0.2150, loss: 0.6222
2024-05-30 00:52:07,428 - mmdet - INFO - Epoch [11][150/7330] lr: 1.000e-05, eta: 2:46:48, time: 0.680, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0403, loss_cls: 0.1489, acc: 94.2827, loss_bbox: 0.2006, loss_mask: 0.2112, loss: 0.6127
2024-05-30 00:52:41,362 - mmdet - INFO - Epoch [11][200/7330] lr: 1.000e-05, eta: 2:46:14, time: 0.679, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0379, loss_cls: 0.1464, acc: 94.4250, loss_bbox: 0.1999, loss_mask: 0.2115, loss: 0.6082
2024-05-30 00:53:15,343 - mmdet - INFO - Epoch [11][250/7330] lr: 1.000e-05, eta: 2:45:39, time: 0.680, data_time: 0.059, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0391, loss_cls: 0.1523, acc: 94.2854, loss_bbox: 0.2021, loss_mask: 0.2134, loss: 0.6201
2024-05-30 00:53:49,925 - mmdet - INFO - Epoch [11][300/7330] lr: 1.000e-05, eta: 2:45:05, time: 0.692, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0409, loss_cls: 0.1534, acc: 94.0806, loss_bbox: 0.2083, loss_mask: 0.2174, loss: 0.6335
2024-05-30 00:54:24,202 - mmdet - INFO - Epoch [11][350/7330] lr: 1.000e-05, eta: 2:44:30, time: 0.686, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0409, loss_cls: 0.1506, acc: 94.2834, loss_bbox: 0.2096, loss_mask: 0.2163, loss: 0.6304
2024-05-30 00:54:57,716 - mmdet - INFO - Epoch [11][400/7330] lr: 1.000e-05, eta: 2:43:55, time: 0.670, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0361, loss_cls: 0.1375, acc: 94.6589, loss_bbox: 0.1940,
loss_mask: 0.2114, loss: 0.5907 2024-05-30 00:55:31,337 - mmdet - INFO - Epoch [11][450/7330] lr: 1.000e-05, eta: 2:43:21, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0406, loss_cls: 0.1477, acc: 94.3242, loss_bbox: 0.1987, loss_mask: 0.2088, loss: 0.6084 2024-05-30 00:56:05,044 - mmdet - INFO - Epoch [11][500/7330] lr: 1.000e-05, eta: 2:42:46, time: 0.674, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0379, loss_cls: 0.1505, acc: 94.2961, loss_bbox: 0.1984, loss_mask: 0.2096, loss: 0.6086 2024-05-30 00:56:38,703 - mmdet - INFO - Epoch [11][550/7330] lr: 1.000e-05, eta: 2:42:11, time: 0.673, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0389, loss_cls: 0.1542, acc: 94.0999, loss_bbox: 0.2141, loss_mask: 0.2173, loss: 0.6365 2024-05-30 00:57:11,676 - mmdet - INFO - Epoch [11][600/7330] lr: 1.000e-05, eta: 2:41:37, time: 0.659, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0361, loss_cls: 0.1383, acc: 94.7993, loss_bbox: 0.1873, loss_mask: 0.2059, loss: 0.5784 2024-05-30 00:57:45,217 - mmdet - INFO - Epoch [11][650/7330] lr: 1.000e-05, eta: 2:41:02, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0359, loss_cls: 0.1431, acc: 94.6370, loss_bbox: 0.1957, loss_mask: 0.2167, loss: 0.6034 2024-05-30 00:58:18,516 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 00:58:18,517 - mmdet - INFO - Epoch [11][700/7330] lr: 1.000e-05, eta: 2:40:27, time: 0.666, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0389, loss_cls: 0.1455, acc: 94.4209, loss_bbox: 0.1999, loss_mask: 0.2144, loss: 0.6111 2024-05-30 00:58:53,046 - mmdet - INFO - Epoch [11][750/7330] lr: 1.000e-05, eta: 2:39:53, time: 0.690, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0390, loss_cls: 0.1483, acc: 94.3208, loss_bbox: 0.1976, loss_mask: 0.2110, loss: 0.6084 2024-05-30 00:59:26,578 - mmdet - INFO - Epoch [11][800/7330] lr: 1.000e-05, eta: 2:39:18, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0366, loss_cls: 0.1462, acc: 94.3774, loss_bbox: 0.1954, loss_mask: 0.2117, loss: 0.6025 2024-05-30 01:00:02,389 - mmdet - INFO - Epoch [11][850/7330] lr: 1.000e-05, eta: 2:38:44, time: 0.716, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0358, loss_cls: 0.1447, acc: 94.4590, loss_bbox: 0.1966, loss_mask: 0.2100, loss: 0.5996 2024-05-30 01:00:39,831 - mmdet - INFO - Epoch [11][900/7330] lr: 1.000e-05, eta: 2:38:10, time: 0.749, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0396, loss_cls: 0.1429, acc: 94.5168, loss_bbox: 0.1953, loss_mask: 0.2104, loss: 0.6017 2024-05-30 01:01:16,177 - mmdet - INFO - Epoch [11][950/7330] lr: 1.000e-05, eta: 2:37:36, time: 0.727, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0370, loss_cls: 0.1465, acc: 94.4072, loss_bbox: 0.1979, loss_mask: 0.2072, loss: 0.6011 2024-05-30 01:01:51,320 - mmdet - INFO - Epoch [11][1000/7330] lr: 1.000e-05, eta: 2:37:01, time: 0.703, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0365, loss_cls: 0.1493, acc: 94.3066, loss_bbox: 0.2009, loss_mask: 0.2095, loss: 0.6083 2024-05-30 01:02:29,644 - mmdet - INFO - Epoch [11][1050/7330] lr: 1.000e-05, eta: 2:36:28, time: 0.766, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0378, loss_cls: 0.1450, acc: 94.5210, loss_bbox: 0.1941, 
loss_mask: 0.2096, loss: 0.5989 2024-05-30 01:03:02,709 - mmdet - INFO - Epoch [11][1100/7330] lr: 1.000e-05, eta: 2:35:53, time: 0.661, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0385, loss_cls: 0.1539, acc: 94.0999, loss_bbox: 0.2119, loss_mask: 0.2183, loss: 0.6345 2024-05-30 01:03:35,426 - mmdet - INFO - Epoch [11][1150/7330] lr: 1.000e-05, eta: 2:35:18, time: 0.654, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0380, loss_cls: 0.1443, acc: 94.4856, loss_bbox: 0.1979, loss_mask: 0.2098, loss: 0.6023 2024-05-30 01:04:11,050 - mmdet - INFO - Epoch [11][1200/7330] lr: 1.000e-05, eta: 2:34:44, time: 0.713, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0358, loss_cls: 0.1377, acc: 94.7947, loss_bbox: 0.1939, loss_mask: 0.2097, loss: 0.5896 2024-05-30 01:04:44,264 - mmdet - INFO - Epoch [11][1250/7330] lr: 1.000e-05, eta: 2:34:09, time: 0.664, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0387, loss_cls: 0.1484, acc: 94.3882, loss_bbox: 0.2004, loss_mask: 0.2129, loss: 0.6139 2024-05-30 01:05:18,455 - mmdet - INFO - Epoch [11][1300/7330] lr: 1.000e-05, eta: 2:33:35, time: 0.684, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0370, loss_cls: 0.1462, acc: 94.4534, loss_bbox: 0.1964, loss_mask: 0.2107, loss: 0.6024 2024-05-30 01:05:51,914 - mmdet - INFO - Epoch [11][1350/7330] lr: 1.000e-05, eta: 2:33:00, time: 0.669, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0378, loss_cls: 0.1473, acc: 94.2925, loss_bbox: 0.2024, loss_mask: 0.2142, loss: 0.6134 2024-05-30 01:06:25,115 - mmdet - INFO - Epoch [11][1400/7330] lr: 1.000e-05, eta: 2:32:25, time: 0.664, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0398, loss_cls: 0.1457, acc: 94.4373, loss_bbox: 0.1993, loss_mask: 0.2146, loss: 0.6119 2024-05-30 01:06:59,059 - mmdet - INFO - Epoch [11][1450/7330] lr: 1.000e-05, eta: 2:31:51, time: 0.679, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0397, loss_cls: 0.1525, acc: 94.2073, loss_bbox: 0.2084, loss_mask: 0.2152, loss: 0.6289 2024-05-30 01:07:32,961 - mmdet - INFO - Epoch [11][1500/7330] lr: 1.000e-05, eta: 2:31:16, time: 0.678, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0405, loss_cls: 0.1513, acc: 94.2656, loss_bbox: 0.2047, loss_mask: 0.2135, loss: 0.6232 2024-05-30 01:08:07,057 - mmdet - INFO - Epoch [11][1550/7330] lr: 1.000e-05, eta: 2:30:41, time: 0.682, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0376, loss_cls: 0.1464, acc: 94.4294, loss_bbox: 0.2018, loss_mask: 0.2116, loss: 0.6088 2024-05-30 01:08:40,472 - mmdet - INFO - Epoch [11][1600/7330] lr: 1.000e-05, eta: 2:30:07, time: 0.668, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0372, loss_cls: 0.1477, acc: 94.3730, loss_bbox: 0.1977, loss_mask: 0.2112, loss: 0.6059 2024-05-30 01:09:14,269 - mmdet - INFO - Epoch [11][1650/7330] lr: 1.000e-05, eta: 2:29:32, time: 0.676, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0379, loss_cls: 0.1451, acc: 94.4197, loss_bbox: 0.1987, loss_mask: 0.2147, loss: 0.6094 2024-05-30 01:09:48,239 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 01:09:48,239 - mmdet - INFO - Epoch [11][1700/7330] lr: 1.000e-05, eta: 2:28:58, time: 0.679, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0398, loss_cls: 0.1514, acc: 94.2307, loss_bbox: 
0.2069, loss_mask: 0.2134, loss: 0.6237 2024-05-30 01:10:23,458 - mmdet - INFO - Epoch [11][1750/7330] lr: 1.000e-05, eta: 2:28:23, time: 0.704, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0383, loss_cls: 0.1425, acc: 94.6006, loss_bbox: 0.2000, loss_mask: 0.2130, loss: 0.6052 2024-05-30 01:11:02,157 - mmdet - INFO - Epoch [11][1800/7330] lr: 1.000e-05, eta: 2:27:49, time: 0.774, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0385, loss_cls: 0.1541, acc: 94.1104, loss_bbox: 0.2068, loss_mask: 0.2201, loss: 0.6325 2024-05-30 01:11:38,116 - mmdet - INFO - Epoch [11][1850/7330] lr: 1.000e-05, eta: 2:27:15, time: 0.719, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0353, loss_cls: 0.1392, acc: 94.6509, loss_bbox: 0.1926, loss_mask: 0.2057, loss: 0.5843 2024-05-30 01:12:15,950 - mmdet - INFO - Epoch [11][1900/7330] lr: 1.000e-05, eta: 2:26:41, time: 0.757, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0358, loss_cls: 0.1385, acc: 94.7407, loss_bbox: 0.1938, loss_mask: 0.2083, loss: 0.5879 2024-05-30 01:12:51,614 - mmdet - INFO - Epoch [11][1950/7330] lr: 1.000e-05, eta: 2:26:07, time: 0.713, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0383, loss_cls: 0.1478, acc: 94.4331, loss_bbox: 0.2008, loss_mask: 0.2134, loss: 0.6125 2024-05-30 01:13:24,940 - mmdet - INFO - Epoch [11][2000/7330] lr: 1.000e-05, eta: 2:25:32, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0377, loss_cls: 0.1416, acc: 94.6704, loss_bbox: 0.1921, loss_mask: 0.2134, loss: 0.5965 2024-05-30 01:13:57,814 - mmdet - INFO - Epoch [11][2050/7330] lr: 1.000e-05, eta: 2:24:58, time: 0.658, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0368, loss_cls: 0.1458, acc: 94.4729, loss_bbox: 0.1942, loss_mask: 0.2071, loss: 0.5961 2024-05-30 01:14:33,209 - mmdet - INFO - Epoch [11][2100/7330] lr: 1.000e-05, eta: 2:24:23, time: 0.708, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0388, loss_cls: 0.1509, acc: 94.3003, loss_bbox: 0.2048, loss_mask: 0.2173, loss: 0.6246 2024-05-30 01:15:06,525 - mmdet - INFO - Epoch [11][2150/7330] lr: 1.000e-05, eta: 2:23:49, time: 0.666, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0369, loss_cls: 0.1421, acc: 94.5735, loss_bbox: 0.1940, loss_mask: 0.2086, loss: 0.5933 2024-05-30 01:15:40,681 - mmdet - INFO - Epoch [11][2200/7330] lr: 1.000e-05, eta: 2:23:14, time: 0.683, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0386, loss_cls: 0.1472, acc: 94.3928, loss_bbox: 0.2011, loss_mask: 0.2109, loss: 0.6098 2024-05-30 01:16:14,879 - mmdet - INFO - Epoch [11][2250/7330] lr: 1.000e-05, eta: 2:22:39, time: 0.684, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0393, loss_cls: 0.1454, acc: 94.4568, loss_bbox: 0.2026, loss_mask: 0.2131, loss: 0.6132 2024-05-30 01:16:48,015 - mmdet - INFO - Epoch [11][2300/7330] lr: 1.000e-05, eta: 2:22:05, time: 0.663, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0357, loss_cls: 0.1420, acc: 94.5962, loss_bbox: 0.1929, loss_mask: 0.2094, loss: 0.5919 2024-05-30 01:17:21,893 - mmdet - INFO - Epoch [11][2350/7330] lr: 1.000e-05, eta: 2:21:30, time: 0.678, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0375, loss_cls: 0.1444, acc: 94.5232, loss_bbox: 0.2019, loss_mask: 0.2153, loss: 0.6109 2024-05-30 01:17:56,062 - mmdet - INFO - Epoch 
[11][2400/7330] lr: 1.000e-05, eta: 2:20:56, time: 0.683, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0419, loss_cls: 0.1568, acc: 94.0205, loss_bbox: 0.2103, loss_mask: 0.2184, loss: 0.6416 2024-05-30 01:18:29,905 - mmdet - INFO - Epoch [11][2450/7330] lr: 1.000e-05, eta: 2:20:21, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0378, loss_cls: 0.1472, acc: 94.3950, loss_bbox: 0.1952, loss_mask: 0.2094, loss: 0.6018 2024-05-30 01:19:03,238 - mmdet - INFO - Epoch [11][2500/7330] lr: 1.000e-05, eta: 2:19:46, time: 0.667, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0391, loss_cls: 0.1503, acc: 94.2334, loss_bbox: 0.2041, loss_mask: 0.2201, loss: 0.6274 2024-05-30 01:19:36,525 - mmdet - INFO - Epoch [11][2550/7330] lr: 1.000e-05, eta: 2:19:12, time: 0.666, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0375, loss_cls: 0.1493, acc: 94.3545, loss_bbox: 0.1987, loss_mask: 0.2136, loss: 0.6132 2024-05-30 01:20:09,920 - mmdet - INFO - Epoch [11][2600/7330] lr: 1.000e-05, eta: 2:18:37, time: 0.668, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0376, loss_cls: 0.1539, acc: 94.1501, loss_bbox: 0.2043, loss_mask: 0.2144, loss: 0.6227 2024-05-30 01:20:49,842 - mmdet - INFO - Epoch [11][2650/7330] lr: 1.000e-05, eta: 2:18:03, time: 0.799, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0386, loss_cls: 0.1473, acc: 94.3154, loss_bbox: 0.1986, loss_mask: 0.2067, loss: 0.6038 2024-05-30 01:21:25,321 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 01:21:25,321 - mmdet - INFO - Epoch [11][2700/7330] lr: 1.000e-05, eta: 2:17:29, time: 0.710, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0373, loss_cls: 0.1432, acc: 94.5586, loss_bbox: 0.1976, loss_mask: 0.2151, loss: 0.6050 2024-05-30 01:22:00,840 - mmdet - INFO - Epoch [11][2750/7330] lr: 1.000e-05, eta: 2:16:55, time: 0.710, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0375, loss_cls: 0.1449, acc: 94.4890, loss_bbox: 0.1976, loss_mask: 0.2098, loss: 0.6023 2024-05-30 01:22:38,398 - mmdet - INFO - Epoch [11][2800/7330] lr: 1.000e-05, eta: 2:16:21, time: 0.751, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0371, loss_cls: 0.1502, acc: 94.3274, loss_bbox: 0.1991, loss_mask: 0.2136, loss: 0.6131 2024-05-30 01:23:11,164 - mmdet - INFO - Epoch [11][2850/7330] lr: 1.000e-05, eta: 2:15:46, time: 0.655, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0374, loss_cls: 0.1442, acc: 94.4795, loss_bbox: 0.1980, loss_mask: 0.2135, loss: 0.6049 2024-05-30 01:23:44,461 - mmdet - INFO - Epoch [11][2900/7330] lr: 1.000e-05, eta: 2:15:11, time: 0.666, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0380, loss_cls: 0.1465, acc: 94.4060, loss_bbox: 0.1970, loss_mask: 0.2127, loss: 0.6068 2024-05-30 01:24:19,625 - mmdet - INFO - Epoch [11][2950/7330] lr: 1.000e-05, eta: 2:14:37, time: 0.703, data_time: 0.038, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0381, loss_cls: 0.1490, acc: 94.3142, loss_bbox: 0.2075, loss_mask: 0.2164, loss: 0.6235 2024-05-30 01:24:53,092 - mmdet - INFO - Epoch [11][3000/7330] lr: 1.000e-05, eta: 2:14:02, time: 0.669, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0395, loss_cls: 0.1573, acc: 94.0681, loss_bbox: 0.2068, loss_mask: 0.2197, loss: 0.6362 2024-05-30 01:25:26,426 - mmdet - INFO 
- Epoch [11][3050/7330] lr: 1.000e-05, eta: 2:13:28, time: 0.667, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0375, loss_cls: 0.1433, acc: 94.5354, loss_bbox: 0.1966, loss_mask: 0.2101, loss: 0.5996 2024-05-30 01:26:00,148 - mmdet - INFO - Epoch [11][3100/7330] lr: 1.000e-05, eta: 2:12:53, time: 0.674, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0385, loss_cls: 0.1483, acc: 94.3560, loss_bbox: 0.2010, loss_mask: 0.2126, loss: 0.6128 2024-05-30 01:26:33,063 - mmdet - INFO - Epoch [11][3150/7330] lr: 1.000e-05, eta: 2:12:18, time: 0.658, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0373, loss_cls: 0.1489, acc: 94.3442, loss_bbox: 0.1980, loss_mask: 0.2119, loss: 0.6074 2024-05-30 01:27:06,452 - mmdet - INFO - Epoch [11][3200/7330] lr: 1.000e-05, eta: 2:11:44, time: 0.668, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0351, loss_cls: 0.1378, acc: 94.7278, loss_bbox: 0.1891, loss_mask: 0.2086, loss: 0.5813 2024-05-30 01:27:40,467 - mmdet - INFO - Epoch [11][3250/7330] lr: 1.000e-05, eta: 2:11:09, time: 0.680, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0388, loss_cls: 0.1507, acc: 94.2698, loss_bbox: 0.2042, loss_mask: 0.2125, loss: 0.6193 2024-05-30 01:28:14,379 - mmdet - INFO - Epoch [11][3300/7330] lr: 1.000e-05, eta: 2:10:35, time: 0.678, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0395, loss_cls: 0.1582, acc: 94.0134, loss_bbox: 0.2125, loss_mask: 0.2154, loss: 0.6389 2024-05-30 01:28:48,031 - mmdet - INFO - Epoch [11][3350/7330] lr: 1.000e-05, eta: 2:10:00, time: 0.673, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0372, loss_cls: 0.1457, acc: 94.5210, loss_bbox: 0.1976, loss_mask: 0.2083, loss: 0.5996 2024-05-30 01:29:21,967 - mmdet - INFO - Epoch [11][3400/7330] lr: 1.000e-05, eta: 2:09:25, time: 0.678, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0391, loss_cls: 0.1434, acc: 94.5315, loss_bbox: 0.1988, loss_mask: 0.2092, loss: 0.6024 2024-05-30 01:29:55,928 - mmdet - INFO - Epoch [11][3450/7330] lr: 1.000e-05, eta: 2:08:51, time: 0.680, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0398, loss_cls: 0.1493, acc: 94.3552, loss_bbox: 0.2014, loss_mask: 0.2137, loss: 0.6166 2024-05-30 01:30:31,177 - mmdet - INFO - Epoch [11][3500/7330] lr: 1.000e-05, eta: 2:08:16, time: 0.705, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0378, loss_cls: 0.1483, acc: 94.3271, loss_bbox: 0.1988, loss_mask: 0.2135, loss: 0.6123 2024-05-30 01:31:08,631 - mmdet - INFO - Epoch [11][3550/7330] lr: 1.000e-05, eta: 2:07:42, time: 0.749, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0359, loss_cls: 0.1425, acc: 94.6670, loss_bbox: 0.1901, loss_mask: 0.2106, loss: 0.5905 2024-05-30 01:31:44,716 - mmdet - INFO - Epoch [11][3600/7330] lr: 1.000e-05, eta: 2:07:08, time: 0.722, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0372, loss_cls: 0.1419, acc: 94.6206, loss_bbox: 0.1956, loss_mask: 0.2054, loss: 0.5917 2024-05-30 01:32:22,454 - mmdet - INFO - Epoch [11][3650/7330] lr: 1.000e-05, eta: 2:06:34, time: 0.755, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0361, loss_cls: 0.1447, acc: 94.5356, loss_bbox: 0.1935, loss_mask: 0.2124, loss: 0.5987 2024-05-30 01:32:57,895 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 01:32:57,895 - mmdet 
- INFO - Epoch [11][3700/7330] lr: 1.000e-05, eta: 2:06:00, time: 0.709, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0373, loss_cls: 0.1414, acc: 94.5850, loss_bbox: 0.1919, loss_mask: 0.2068, loss: 0.5894 2024-05-30 01:33:31,356 - mmdet - INFO - Epoch [11][3750/7330] lr: 1.000e-05, eta: 2:05:25, time: 0.669, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0400, loss_cls: 0.1502, acc: 94.3594, loss_bbox: 0.2035, loss_mask: 0.2166, loss: 0.6242 2024-05-30 01:34:05,208 - mmdet - INFO - Epoch [11][3800/7330] lr: 1.000e-05, eta: 2:04:51, time: 0.677, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0367, loss_cls: 0.1364, acc: 94.7839, loss_bbox: 0.1838, loss_mask: 0.2055, loss: 0.5733 2024-05-30 01:34:41,323 - mmdet - INFO - Epoch [11][3850/7330] lr: 1.000e-05, eta: 2:04:16, time: 0.722, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0389, loss_cls: 0.1449, acc: 94.5298, loss_bbox: 0.2014, loss_mask: 0.2191, loss: 0.6171 2024-05-30 01:35:14,714 - mmdet - INFO - Epoch [11][3900/7330] lr: 1.000e-05, eta: 2:03:42, time: 0.668, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0369, loss_cls: 0.1434, acc: 94.5684, loss_bbox: 0.1942, loss_mask: 0.2091, loss: 0.5957 2024-05-30 01:35:48,454 - mmdet - INFO - Epoch [11][3950/7330] lr: 1.000e-05, eta: 2:03:07, time: 0.675, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0388, loss_cls: 0.1549, acc: 94.1079, loss_bbox: 0.2070, loss_mask: 0.2168, loss: 0.6310 2024-05-30 01:36:22,129 - mmdet - INFO - Epoch [11][4000/7330] lr: 1.000e-05, eta: 2:02:32, time: 0.674, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0372, loss_cls: 0.1465, acc: 94.4656, loss_bbox: 0.1983, loss_mask: 0.2104, loss: 0.6055 2024-05-30 01:36:55,545 - mmdet - INFO - Epoch [11][4050/7330] lr: 1.000e-05, eta: 2:01:58, time: 0.668, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0401, loss_cls: 0.1523, acc: 94.2341, loss_bbox: 0.2055, loss_mask: 0.2135, loss: 0.6241 2024-05-30 01:37:28,870 - mmdet - INFO - Epoch [11][4100/7330] lr: 1.000e-05, eta: 2:01:23, time: 0.666, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0355, loss_cls: 0.1418, acc: 94.5510, loss_bbox: 0.1999, loss_mask: 0.2118, loss: 0.6004 2024-05-30 01:38:02,750 - mmdet - INFO - Epoch [11][4150/7330] lr: 1.000e-05, eta: 2:00:49, time: 0.678, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0377, loss_cls: 0.1460, acc: 94.4517, loss_bbox: 0.2009, loss_mask: 0.2151, loss: 0.6117 2024-05-30 01:38:36,873 - mmdet - INFO - Epoch [11][4200/7330] lr: 1.000e-05, eta: 2:00:14, time: 0.682, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0396, loss_cls: 0.1553, acc: 94.0886, loss_bbox: 0.2069, loss_mask: 0.2170, loss: 0.6314 2024-05-30 01:39:11,283 - mmdet - INFO - Epoch [11][4250/7330] lr: 1.000e-05, eta: 1:59:40, time: 0.688, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0408, loss_cls: 0.1532, acc: 94.2610, loss_bbox: 0.2060, loss_mask: 0.2182, loss: 0.6319 2024-05-30 01:39:45,470 - mmdet - INFO - Epoch [11][4300/7330] lr: 1.000e-05, eta: 1:59:05, time: 0.684, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0379, loss_cls: 0.1462, acc: 94.4290, loss_bbox: 0.1972, loss_mask: 0.2122, loss: 0.6061 2024-05-30 01:40:19,537 - mmdet - INFO - Epoch [11][4350/7330] lr: 1.000e-05, eta: 1:58:30, time: 0.681, data_time: 0.060, 
memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0407, loss_cls: 0.1503, acc: 94.2935, loss_bbox: 0.2048, loss_mask: 0.2147, loss: 0.6234 2024-05-30 01:41:00,760 - mmdet - INFO - Epoch [11][4400/7330] lr: 1.000e-05, eta: 1:57:57, time: 0.824, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0373, loss_cls: 0.1404, acc: 94.6528, loss_bbox: 0.1889, loss_mask: 0.2045, loss: 0.5836 2024-05-30 01:41:33,883 - mmdet - INFO - Epoch [11][4450/7330] lr: 1.000e-05, eta: 1:57:22, time: 0.662, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0356, loss_cls: 0.1447, acc: 94.5564, loss_bbox: 0.1954, loss_mask: 0.2141, loss: 0.6021 2024-05-30 01:42:12,211 - mmdet - INFO - Epoch [11][4500/7330] lr: 1.000e-05, eta: 1:56:48, time: 0.767, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0388, loss_cls: 0.1529, acc: 94.1687, loss_bbox: 0.2029, loss_mask: 0.2150, loss: 0.6226 2024-05-30 01:42:49,585 - mmdet - INFO - Epoch [11][4550/7330] lr: 1.000e-05, eta: 1:56:14, time: 0.747, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0386, loss_cls: 0.1497, acc: 94.3752, loss_bbox: 0.2011, loss_mask: 0.2128, loss: 0.6151 2024-05-30 01:43:22,907 - mmdet - INFO - Epoch [11][4600/7330] lr: 1.000e-05, eta: 1:55:39, time: 0.667, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0375, loss_cls: 0.1475, acc: 94.3831, loss_bbox: 0.2011, loss_mask: 0.2109, loss: 0.6099 2024-05-30 01:43:56,247 - mmdet - INFO - Epoch [11][4650/7330] lr: 1.000e-05, eta: 1:55:05, time: 0.667, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0390, loss_cls: 0.1485, acc: 94.3315, loss_bbox: 0.2009, loss_mask: 0.2172, loss: 0.6185 2024-05-30 01:44:31,884 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 01:44:31,884 - mmdet - INFO - Epoch [11][4700/7330] lr: 1.000e-05, eta: 1:54:30, time: 0.713, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0373, loss_cls: 0.1529, acc: 94.2021, loss_bbox: 0.2053, loss_mask: 0.2138, loss: 0.6224 2024-05-30 01:45:05,218 - mmdet - INFO - Epoch [11][4750/7330] lr: 1.000e-05, eta: 1:53:56, time: 0.667, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0372, loss_cls: 0.1433, acc: 94.5249, loss_bbox: 0.2005, loss_mask: 0.2149, loss: 0.6070 2024-05-30 01:45:38,891 - mmdet - INFO - Epoch [11][4800/7330] lr: 1.000e-05, eta: 1:53:21, time: 0.673, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0375, loss_cls: 0.1443, acc: 94.4426, loss_bbox: 0.1948, loss_mask: 0.2089, loss: 0.5970 2024-05-30 01:46:13,157 - mmdet - INFO - Epoch [11][4850/7330] lr: 1.000e-05, eta: 1:52:47, time: 0.685, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0378, loss_cls: 0.1450, acc: 94.5518, loss_bbox: 0.1935, loss_mask: 0.2069, loss: 0.5959 2024-05-30 01:46:46,437 - mmdet - INFO - Epoch [11][4900/7330] lr: 1.000e-05, eta: 1:52:12, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0357, loss_cls: 0.1400, acc: 94.6521, loss_bbox: 0.1928, loss_mask: 0.2102, loss: 0.5902 2024-05-30 01:47:21,165 - mmdet - INFO - Epoch [11][4950/7330] lr: 1.000e-05, eta: 1:51:38, time: 0.695, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0392, loss_cls: 0.1492, acc: 94.3052, loss_bbox: 0.2016, loss_mask: 0.2079, loss: 0.6102 2024-05-30 01:47:55,345 - mmdet - INFO - Epoch [11][5000/7330] lr: 1.000e-05, eta: 1:51:03, time: 0.684, data_time: 
0.049, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0376, loss_cls: 0.1483, acc: 94.3098, loss_bbox: 0.2061, loss_mask: 0.2105, loss: 0.6143 2024-05-30 01:48:29,220 - mmdet - INFO - Epoch [11][5050/7330] lr: 1.000e-05, eta: 1:50:29, time: 0.678, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0387, loss_cls: 0.1509, acc: 94.2617, loss_bbox: 0.2049, loss_mask: 0.2143, loss: 0.6222 2024-05-30 01:49:02,333 - mmdet - INFO - Epoch [11][5100/7330] lr: 1.000e-05, eta: 1:49:54, time: 0.662, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0364, loss_cls: 0.1467, acc: 94.4949, loss_bbox: 0.1961, loss_mask: 0.2125, loss: 0.6034 2024-05-30 01:49:35,639 - mmdet - INFO - Epoch [11][5150/7330] lr: 1.000e-05, eta: 1:49:19, time: 0.666, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0386, loss_cls: 0.1472, acc: 94.4055, loss_bbox: 0.2025, loss_mask: 0.2145, loss: 0.6137 2024-05-30 01:50:09,572 - mmdet - INFO - Epoch [11][5200/7330] lr: 1.000e-05, eta: 1:48:45, time: 0.679, data_time: 0.071, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0389, loss_cls: 0.1529, acc: 94.1230, loss_bbox: 0.2083, loss_mask: 0.2134, loss: 0.6262 2024-05-30 01:50:47,183 - mmdet - INFO - Epoch [11][5250/7330] lr: 1.000e-05, eta: 1:48:11, time: 0.752, data_time: 0.067, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0382, loss_cls: 0.1487, acc: 94.3203, loss_bbox: 0.1945, loss_mask: 0.2096, loss: 0.6037 2024-05-30 01:51:24,784 - mmdet - INFO - Epoch [11][5300/7330] lr: 1.000e-05, eta: 1:47:36, time: 0.752, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0375, loss_cls: 0.1424, acc: 94.5391, loss_bbox: 0.1920, loss_mask: 0.2100, loss: 0.5935 2024-05-30 01:52:00,504 - mmdet - INFO - Epoch [11][5350/7330] lr: 1.000e-05, eta: 1:47:02, time: 0.714, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0369, loss_cls: 0.1484, acc: 94.3916, loss_bbox: 0.2031, loss_mask: 0.2154, loss: 0.6172 2024-05-30 01:52:38,721 - mmdet - INFO - Epoch [11][5400/7330] lr: 1.000e-05, eta: 1:46:28, time: 0.764, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0381, loss_cls: 0.1489, acc: 94.3474, loss_bbox: 0.2003, loss_mask: 0.2135, loss: 0.6134 2024-05-30 01:53:14,010 - mmdet - INFO - Epoch [11][5450/7330] lr: 1.000e-05, eta: 1:45:54, time: 0.706, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0387, loss_cls: 0.1524, acc: 94.1875, loss_bbox: 0.2081, loss_mask: 0.2152, loss: 0.6277 2024-05-30 01:53:47,531 - mmdet - INFO - Epoch [11][5500/7330] lr: 1.000e-05, eta: 1:45:19, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0400, loss_cls: 0.1503, acc: 94.2620, loss_bbox: 0.2016, loss_mask: 0.2151, loss: 0.6198 2024-05-30 01:54:20,646 - mmdet - INFO - Epoch [11][5550/7330] lr: 1.000e-05, eta: 1:44:44, time: 0.663, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0379, loss_cls: 0.1487, acc: 94.3232, loss_bbox: 0.2068, loss_mask: 0.2189, loss: 0.6252 2024-05-30 01:54:56,621 - mmdet - INFO - Epoch [11][5600/7330] lr: 1.000e-05, eta: 1:44:10, time: 0.719, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0385, loss_cls: 0.1472, acc: 94.3296, loss_bbox: 0.2000, loss_mask: 0.2133, loss: 0.6112 2024-05-30 01:55:29,514 - mmdet - INFO - Epoch [11][5650/7330] lr: 1.000e-05, eta: 1:43:35, time: 0.658, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0375, loss_cls: 0.1410, acc: 
94.6377, loss_bbox: 0.1911, loss_mask: 0.2064, loss: 0.5885 2024-05-30 01:56:02,935 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 01:56:02,935 - mmdet - INFO - Epoch [11][5700/7330] lr: 1.000e-05, eta: 1:43:01, time: 0.668, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0395, loss_cls: 0.1501, acc: 94.2139, loss_bbox: 0.2041, loss_mask: 0.2206, loss: 0.6267 2024-05-30 01:56:36,196 - mmdet - INFO - Epoch [11][5750/7330] lr: 1.000e-05, eta: 1:42:26, time: 0.665, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0375, loss_cls: 0.1475, acc: 94.3152, loss_bbox: 0.2035, loss_mask: 0.2152, loss: 0.6164 2024-05-30 01:57:09,991 - mmdet - INFO - Epoch [11][5800/7330] lr: 1.000e-05, eta: 1:41:52, time: 0.676, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0368, loss_cls: 0.1426, acc: 94.5569, loss_bbox: 0.1950, loss_mask: 0.2103, loss: 0.5961 2024-05-30 01:57:44,033 - mmdet - INFO - Epoch [11][5850/7330] lr: 1.000e-05, eta: 1:41:17, time: 0.681, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0395, loss_cls: 0.1514, acc: 94.2283, loss_bbox: 0.2034, loss_mask: 0.2205, loss: 0.6283 2024-05-30 01:58:18,051 - mmdet - INFO - Epoch [11][5900/7330] lr: 1.000e-05, eta: 1:40:42, time: 0.680, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0365, loss_cls: 0.1438, acc: 94.4680, loss_bbox: 0.1965, loss_mask: 0.2089, loss: 0.5979 2024-05-30 01:58:52,491 - mmdet - INFO - Epoch [11][5950/7330] lr: 1.000e-05, eta: 1:40:08, time: 0.689, data_time: 0.075, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0400, loss_cls: 0.1500, acc: 94.3333, loss_bbox: 0.2038, loss_mask: 0.2150, loss: 0.6220 2024-05-30 01:59:25,919 - mmdet - INFO - Epoch [11][6000/7330] lr: 1.000e-05, eta: 1:39:33, time: 0.669, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0366, loss_cls: 0.1386, acc: 94.6123, loss_bbox: 0.1952, loss_mask: 0.2081, loss: 0.5892 2024-05-30 01:59:59,478 - mmdet - INFO - Epoch [11][6050/7330] lr: 1.000e-05, eta: 1:38:59, time: 0.671, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0384, loss_cls: 0.1502, acc: 94.2947, loss_bbox: 0.2050, loss_mask: 0.2163, loss: 0.6228 2024-05-30 02:00:33,321 - mmdet - INFO - Epoch [11][6100/7330] lr: 1.000e-05, eta: 1:38:24, time: 0.677, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0387, loss_cls: 0.1488, acc: 94.4226, loss_bbox: 0.1981, loss_mask: 0.2123, loss: 0.6113 2024-05-30 02:01:12,443 - mmdet - INFO - Epoch [11][6150/7330] lr: 1.000e-05, eta: 1:37:50, time: 0.782, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0379, loss_cls: 0.1481, acc: 94.3667, loss_bbox: 0.2007, loss_mask: 0.2136, loss: 0.6141 2024-05-30 02:01:48,399 - mmdet - INFO - Epoch [11][6200/7330] lr: 1.000e-05, eta: 1:37:16, time: 0.719, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0381, loss_cls: 0.1495, acc: 94.3525, loss_bbox: 0.2061, loss_mask: 0.2171, loss: 0.6242 2024-05-30 02:02:26,909 - mmdet - INFO - Epoch [11][6250/7330] lr: 1.000e-05, eta: 1:36:42, time: 0.770, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0388, loss_cls: 0.1541, acc: 94.1492, loss_bbox: 0.2060, loss_mask: 0.2155, loss: 0.6279 2024-05-30 02:03:05,289 - mmdet - INFO - Epoch [11][6300/7330] lr: 1.000e-05, eta: 1:36:08, time: 0.767, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0374, loss_cls: 
0.1454, acc: 94.4583, loss_bbox: 0.2008, loss_mask: 0.2186, loss: 0.6147 2024-05-30 02:03:38,632 - mmdet - INFO - Epoch [11][6350/7330] lr: 1.000e-05, eta: 1:35:33, time: 0.667, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0376, loss_cls: 0.1479, acc: 94.3071, loss_bbox: 0.2038, loss_mask: 0.2186, loss: 0.6213 2024-05-30 02:04:12,027 - mmdet - INFO - Epoch [11][6400/7330] lr: 1.000e-05, eta: 1:34:58, time: 0.668, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0373, loss_cls: 0.1411, acc: 94.5454, loss_bbox: 0.1952, loss_mask: 0.2109, loss: 0.5959 2024-05-30 02:04:48,746 - mmdet - INFO - Epoch [11][6450/7330] lr: 1.000e-05, eta: 1:34:24, time: 0.734, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0383, loss_cls: 0.1505, acc: 94.3074, loss_bbox: 0.2055, loss_mask: 0.2187, loss: 0.6263 2024-05-30 02:05:22,147 - mmdet - INFO - Epoch [11][6500/7330] lr: 1.000e-05, eta: 1:33:50, time: 0.668, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0368, loss_cls: 0.1418, acc: 94.5518, loss_bbox: 0.1983, loss_mask: 0.2144, loss: 0.6031 2024-05-30 02:05:55,847 - mmdet - INFO - Epoch [11][6550/7330] lr: 1.000e-05, eta: 1:33:15, time: 0.674, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0379, loss_cls: 0.1521, acc: 94.1931, loss_bbox: 0.2036, loss_mask: 0.2131, loss: 0.6188 2024-05-30 02:06:29,325 - mmdet - INFO - Epoch [11][6600/7330] lr: 1.000e-05, eta: 1:32:40, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0364, loss_cls: 0.1435, acc: 94.6157, loss_bbox: 0.1920, loss_mask: 0.2071, loss: 0.5905 2024-05-30 02:07:02,717 - mmdet - INFO - Epoch [11][6650/7330] lr: 1.000e-05, eta: 1:32:06, time: 0.668, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0365, loss_cls: 0.1453, acc: 94.4783, loss_bbox: 0.1984, loss_mask: 0.2137, loss: 0.6060 2024-05-30 02:07:36,673 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py 2024-05-30 02:07:36,673 - mmdet - INFO - Epoch [11][6700/7330] lr: 1.000e-05, eta: 1:31:31, time: 0.679, data_time: 0.063, memory: 11628, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0378, loss_cls: 0.1471, acc: 94.3555, loss_bbox: 0.2005, loss_mask: 0.2129, loss: 0.6093 2024-05-30 02:08:09,932 - mmdet - INFO - Epoch [11][6750/7330] lr: 1.000e-05, eta: 1:30:57, time: 0.665, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0369, loss_cls: 0.1490, acc: 94.3347, loss_bbox: 0.2010, loss_mask: 0.2151, loss: 0.6136 2024-05-30 02:08:43,079 - mmdet - INFO - Epoch [11][6800/7330] lr: 1.000e-05, eta: 1:30:22, time: 0.663, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0383, loss_cls: 0.1454, acc: 94.4629, loss_bbox: 0.1991, loss_mask: 0.2116, loss: 0.6082 2024-05-30 02:09:16,263 - mmdet - INFO - Epoch [11][6850/7330] lr: 1.000e-05, eta: 1:29:47, time: 0.664, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0370, loss_cls: 0.1429, acc: 94.5557, loss_bbox: 0.1943, loss_mask: 0.2061, loss: 0.5918 2024-05-30 02:09:49,258 - mmdet - INFO - Epoch [11][6900/7330] lr: 1.000e-05, eta: 1:29:13, time: 0.660, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0369, loss_cls: 0.1426, acc: 94.5920, loss_bbox: 0.1907, loss_mask: 0.2062, loss: 0.5871 2024-05-30 02:10:23,240 - mmdet - INFO - Epoch [11][6950/7330] lr: 1.000e-05, eta: 1:28:38, time: 0.680, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0415, 
loss_cls: 0.1547, acc: 94.0630, loss_bbox: 0.2101, loss_mask: 0.2180, loss: 0.6375
2024-05-30 02:10:59,666 - mmdet - INFO - Epoch [11][7000/7330] lr: 1.000e-05, eta: 1:28:04, time: 0.729, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0390, loss_cls: 0.1502, acc: 94.3474, loss_bbox: 0.2002, loss_mask: 0.2158, loss: 0.6188
2024-05-30 02:11:38,700 - mmdet - INFO - Epoch [11][7050/7330] lr: 1.000e-05, eta: 1:27:30, time: 0.781, data_time: 0.069, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0388, loss_cls: 0.1477, acc: 94.3552, loss_bbox: 0.2059, loss_mask: 0.2140, loss: 0.6186
2024-05-30 02:12:13,889 - mmdet - INFO - Epoch [11][7100/7330] lr: 1.000e-05, eta: 1:26:55, time: 0.704, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0381, loss_cls: 0.1426, acc: 94.5405, loss_bbox: 0.1959, loss_mask: 0.2120, loss: 0.6014
2024-05-30 02:12:50,211 - mmdet - INFO - Epoch [11][7150/7330] lr: 1.000e-05, eta: 1:26:21, time: 0.726, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0405, loss_cls: 0.1526, acc: 94.2139, loss_bbox: 0.2064, loss_mask: 0.2158, loss: 0.6287
2024-05-30 02:13:27,398 - mmdet - INFO - Epoch [11][7200/7330] lr: 1.000e-05, eta: 1:25:47, time: 0.744, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0351, loss_cls: 0.1402, acc: 94.6658, loss_bbox: 0.1890, loss_mask: 0.2061, loss: 0.5813
2024-05-30 02:14:00,560 - mmdet - INFO - Epoch [11][7250/7330] lr: 1.000e-05, eta: 1:25:12, time: 0.663, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0357, loss_cls: 0.1379, acc: 94.8599, loss_bbox: 0.1875, loss_mask: 0.2107, loss: 0.5829
2024-05-30 02:14:33,566 - mmdet - INFO - Epoch [11][7300/7330] lr: 1.000e-05, eta: 1:24:38, time: 0.660, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0390, loss_cls: 0.1510, acc: 94.2493, loss_bbox: 0.2049, loss_mask: 0.2140, loss: 0.6219
2024-05-30 02:14:56,435 - mmdet - INFO - Saving checkpoint at 11 epochs
2024-05-30 02:16:45,202 - mmdet - INFO - Evaluating bbox...
2024-05-30 02:17:06,779 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.463
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.683
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.505
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.270
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.508
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.629
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.379
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.628
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.751
2024-05-30 02:17:06,779 - mmdet - INFO - Evaluating segm...
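The per-epoch Epoch(val) summaries (one appears just below, after the segm table) collect the headline metrics on a single line. A minimal sketch for tabulating bbox/segm mAP across epochs from a log like this one (the log filename is a placeholder):

# Minimal sketch (assumed filename "train.log"): extract the per-epoch
# "Epoch(val) ... bbox_mAP ... segm_mAP ..." summaries to track validation
# mAP over epochs.
import re
from pathlib import Path

pattern = re.compile(r"Epoch\(val\) \[(\d+)\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)")

for line in Path("train.log").read_text().splitlines():
    match = pattern.search(line)
    if match:
        epoch = int(match.group(1))
        bbox_map, segm_map = float(match.group(2)), float(match.group(3))
        print(f"epoch {epoch:2d}: bbox_mAP={bbox_map:.3f} segm_mAP={segm_map:.3f}")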
2024-05-30 02:17:33,073 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.412
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.648
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.443
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.191
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.449
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.518
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.518
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.518
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.566
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.707
2024-05-30 02:17:33,400 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-30 02:17:33,402 - mmdet - INFO - Epoch(val) [11][625] bbox_mAP: 0.4630, bbox_mAP_50: 0.6830, bbox_mAP_75: 0.5050, bbox_mAP_s: 0.2700, bbox_mAP_m: 0.5080, bbox_mAP_l: 0.6290, bbox_mAP_copypaste: 0.463 0.683 0.505 0.270 0.508 0.629, segm_mAP: 0.4120, segm_mAP_50: 0.6480, segm_mAP_75: 0.4430, segm_mAP_s: 0.1910, segm_mAP_m: 0.4490, segm_mAP_l: 0.6250, segm_mAP_copypaste: 0.412 0.648 0.443 0.191 0.449 0.625
2024-05-30 02:18:15,235 - mmdet - INFO - Epoch [12][50/7330] lr: 1.000e-06, eta: 1:23:41, time: 0.836, data_time: 0.125, memory: 11628, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0393, loss_cls: 0.1511, acc: 94.3901, loss_bbox: 0.2032, loss_mask: 0.2161, loss: 0.6230
2024-05-30 02:18:54,593 - mmdet - INFO - Epoch [12][100/7330] lr: 1.000e-06, eta: 1:23:07, time: 0.787, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0377, loss_cls: 0.1457, acc: 94.4243, loss_bbox: 0.1958, loss_mask: 0.2113, loss: 0.6031
2024-05-30 02:19:28,414 - mmdet - INFO - Epoch [12][150/7330] lr: 1.000e-06, eta: 1:22:33, time: 0.676, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0386, loss_cls: 0.1435, acc: 94.4983, loss_bbox: 0.1959, loss_mask: 0.2075, loss: 0.5984
2024-05-30 02:20:02,514 - mmdet - INFO - Epoch [12][200/7330] lr: 1.000e-06, eta: 1:21:58, time: 0.682, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0381, loss_cls: 0.1435, acc: 94.4670, loss_bbox: 0.1952, loss_mask: 0.2072, loss: 0.5961
2024-05-30 02:20:36,164 - mmdet - INFO - Epoch [12][250/7330] lr: 1.000e-06, eta: 1:21:23, time: 0.673, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0374, loss_cls: 0.1428, acc: 94.5630, loss_bbox: 0.1964, loss_mask: 0.2095, loss: 0.5969
2024-05-30 02:21:09,877 - mmdet - INFO - Epoch [12][300/7330] lr: 1.000e-06, eta: 1:20:49, time: 0.674, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0385, loss_cls: 0.1443, acc: 94.5281, loss_bbox: 0.1995, loss_mask: 0.2126, loss: 0.6068
2024-05-30 02:21:43,424 - mmdet - INFO - Epoch [12][350/7330] lr: 1.000e-06, eta: 1:20:14, time: 0.671, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0362, loss_cls: 0.1433, acc: 94.5757, loss_bbox: 0.1947, loss_mask: 0.2080, loss: 0.5938
2024-05-30 02:22:16,430 - mmdet - INFO - Epoch [12][400/7330] lr: 1.000e-06, eta: 1:19:40, time: 0.660, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0369, loss_cls: 0.1418, acc: 94.6516, loss_bbox: 0.1955,
loss_mask: 0.2087, loss: 0.5952 2024-05-30 02:22:50,454 - mmdet - INFO - Epoch [12][450/7330] lr: 1.000e-06, eta: 1:19:05, time: 0.681, data_time: 0.069, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0382, loss_cls: 0.1539, acc: 94.1814, loss_bbox: 0.2076, loss_mask: 0.2154, loss: 0.6281 2024-05-30 02:23:23,566 - mmdet - INFO - Epoch [12][500/7330] lr: 1.000e-06, eta: 1:18:31, time: 0.662, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0343, loss_cls: 0.1338, acc: 94.8833, loss_bbox: 0.1848, loss_mask: 0.2066, loss: 0.5701 2024-05-30 02:23:56,969 - mmdet - INFO - Epoch [12][550/7330] lr: 1.000e-06, eta: 1:17:56, time: 0.668, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0367, loss_cls: 0.1407, acc: 94.6709, loss_bbox: 0.1914, loss_mask: 0.2071, loss: 0.5872 2024-05-30 02:24:30,844 - mmdet - INFO - Epoch [12][600/7330] lr: 1.000e-06, eta: 1:17:21, time: 0.677, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0357, loss_cls: 0.1424, acc: 94.5374, loss_bbox: 0.1955, loss_mask: 0.2100, loss: 0.5938 2024-05-30 02:25:04,486 - mmdet - INFO - Epoch [12][650/7330] lr: 1.000e-06, eta: 1:16:47, time: 0.673, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0361, loss_cls: 0.1424, acc: 94.5071, loss_bbox: 0.1984, loss_mask: 0.2097, loss: 0.5977 2024-05-30 02:25:38,157 - mmdet - INFO - Epoch [12][700/7330] lr: 1.000e-06, eta: 1:16:12, time: 0.673, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0366, loss_cls: 0.1418, acc: 94.5339, loss_bbox: 0.1972, loss_mask: 0.2106, loss: 0.5990 2024-05-30 02:26:12,067 - mmdet - INFO - Epoch [12][750/7330] lr: 1.000e-06, eta: 1:15:38, time: 0.678, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0362, loss_cls: 0.1455, acc: 94.3916, loss_bbox: 0.1982, loss_mask: 0.2139, loss: 0.6062 2024-05-30 02:26:45,555 - mmdet - INFO - Epoch [12][800/7330] lr: 1.000e-06, eta: 1:15:03, time: 0.670, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0350, loss_cls: 0.1378, acc: 94.7310, loss_bbox: 0.1905, loss_mask: 0.2103, loss: 0.5845 2024-05-30 02:27:19,334 - mmdet - INFO - Epoch [12][850/7330] lr: 1.000e-06, eta: 1:14:29, time: 0.676, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0356, loss_cls: 0.1363, acc: 94.7661, loss_bbox: 0.1888, loss_mask: 0.2090, loss: 0.5813 2024-05-30 02:27:57,731 - mmdet - INFO - Epoch [12][900/7330] lr: 1.000e-06, eta: 1:13:55, time: 0.768, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0396, loss_cls: 0.1475, acc: 94.3733, loss_bbox: 0.2037, loss_mask: 0.2149, loss: 0.6181 2024-05-30 02:28:34,640 - mmdet - INFO - Epoch [12][950/7330] lr: 1.000e-06, eta: 1:13:20, time: 0.738, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0372, loss_cls: 0.1364, acc: 94.8091, loss_bbox: 0.1927, loss_mask: 0.2110, loss: 0.5894 2024-05-30 02:29:12,220 - mmdet - INFO - Epoch [12][1000/7330] lr: 1.000e-06, eta: 1:12:46, time: 0.752, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0374, loss_cls: 0.1429, acc: 94.5647, loss_bbox: 0.1973, loss_mask: 0.2101, loss: 0.5996 2024-05-30 02:29:50,031 - mmdet - INFO - Epoch [12][1050/7330] lr: 1.000e-06, eta: 1:12:12, time: 0.756, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0392, loss_cls: 0.1414, acc: 94.5508, loss_bbox: 0.1960, loss_mask: 0.2090, loss: 0.5986 2024-05-30 02:30:23,899 - mmdet - INFO - Epoch [12][1100/7330] lr: 
1.000e-06, eta: 1:11:37, time: 0.677, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0376, loss_cls: 0.1454, acc: 94.4863, loss_bbox: 0.2004, loss_mask: 0.2116, loss: 0.6066 2024-05-30 02:30:57,376 - mmdet - INFO - Epoch [12][1150/7330] lr: 1.000e-06, eta: 1:11:03, time: 0.670, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0399, loss_cls: 0.1440, acc: 94.5232, loss_bbox: 0.1983, loss_mask: 0.2070, loss: 0.6024 2024-05-30 02:31:34,221 - mmdet - INFO - Epoch [12][1200/7330] lr: 1.000e-06, eta: 1:10:28, time: 0.737, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0371, loss_cls: 0.1461, acc: 94.3767, loss_bbox: 0.2024, loss_mask: 0.2098, loss: 0.6075 2024-05-30 02:32:07,941 - mmdet - INFO - Epoch [12][1250/7330] lr: 1.000e-06, eta: 1:09:54, time: 0.674, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0373, loss_cls: 0.1479, acc: 94.4319, loss_bbox: 0.1972, loss_mask: 0.2054, loss: 0.6001 2024-05-30 02:32:42,309 - mmdet - INFO - Epoch [12][1300/7330] lr: 1.000e-06, eta: 1:09:19, time: 0.687, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0375, loss_cls: 0.1475, acc: 94.3296, loss_bbox: 0.2015, loss_mask: 0.2134, loss: 0.6123 2024-05-30 02:33:16,392 - mmdet - INFO - Epoch [12][1350/7330] lr: 1.000e-06, eta: 1:08:45, time: 0.682, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0376, loss_cls: 0.1496, acc: 94.2583, loss_bbox: 0.2017, loss_mask: 0.2140, loss: 0.6148 2024-05-30 02:33:49,908 - mmdet - INFO - Epoch [12][1400/7330] lr: 1.000e-06, eta: 1:08:10, time: 0.670, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0356, loss_cls: 0.1343, acc: 94.8943, loss_bbox: 0.1849, loss_mask: 0.2011, loss: 0.5673 2024-05-30 02:34:24,094 - mmdet - INFO - Epoch [12][1450/7330] lr: 1.000e-06, eta: 1:07:36, time: 0.684, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0400, loss_cls: 0.1468, acc: 94.4082, loss_bbox: 0.2052, loss_mask: 0.2185, loss: 0.6232 2024-05-30 02:34:57,337 - mmdet - INFO - Epoch [12][1500/7330] lr: 1.000e-06, eta: 1:07:01, time: 0.665, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0355, loss_cls: 0.1386, acc: 94.7561, loss_bbox: 0.1909, loss_mask: 0.2083, loss: 0.5850 2024-05-30 02:35:31,012 - mmdet - INFO - Epoch [12][1550/7330] lr: 1.000e-06, eta: 1:06:27, time: 0.673, data_time: 0.065, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0376, loss_cls: 0.1433, acc: 94.4878, loss_bbox: 0.1968, loss_mask: 0.2050, loss: 0.5946 2024-05-30 02:36:05,426 - mmdet - INFO - Epoch [12][1600/7330] lr: 1.000e-06, eta: 1:05:52, time: 0.688, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0418, loss_cls: 0.1546, acc: 94.0425, loss_bbox: 0.2135, loss_mask: 0.2167, loss: 0.6393 2024-05-30 02:36:38,996 - mmdet - INFO - Epoch [12][1650/7330] lr: 1.000e-06, eta: 1:05:18, time: 0.671, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0390, loss_cls: 0.1541, acc: 94.1543, loss_bbox: 0.2098, loss_mask: 0.2152, loss: 0.6302 2024-05-30 02:37:12,806 - mmdet - INFO - Epoch [12][1700/7330] lr: 1.000e-06, eta: 1:04:43, time: 0.676, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0377, loss_cls: 0.1436, acc: 94.5754, loss_bbox: 0.1963, loss_mask: 0.2142, loss: 0.6041 2024-05-30 02:37:48,979 - mmdet - INFO - Epoch [12][1750/7330] lr: 1.000e-06, eta: 1:04:09, time: 0.724, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0116, 
loss_rpn_bbox: 0.0378, loss_cls: 0.1407, acc: 94.5715, loss_bbox: 0.1956, loss_mask: 0.2109, loss: 0.5966 2024-05-30 02:38:26,542 - mmdet - INFO - Epoch [12][1800/7330] lr: 1.000e-06, eta: 1:03:34, time: 0.751, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0381, loss_cls: 0.1471, acc: 94.3794, loss_bbox: 0.1983, loss_mask: 0.2073, loss: 0.6030 2024-05-30 02:39:08,099 - mmdet - INFO - Epoch [12][1850/7330] lr: 1.000e-06, eta: 1:03:00, time: 0.831, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0357, loss_cls: 0.1453, acc: 94.4634, loss_bbox: 0.1935, loss_mask: 0.2089, loss: 0.5947 2024-05-30 02:39:44,080 - mmdet - INFO - Epoch [12][1900/7330] lr: 1.000e-06, eta: 1:02:26, time: 0.720, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0396, loss_cls: 0.1466, acc: 94.4111, loss_bbox: 0.2016, loss_mask: 0.2142, loss: 0.6137 2024-05-30 02:40:20,092 - mmdet - INFO - Epoch [12][1950/7330] lr: 1.000e-06, eta: 1:01:51, time: 0.720, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0367, loss_cls: 0.1449, acc: 94.3960, loss_bbox: 0.2008, loss_mask: 0.2108, loss: 0.6056 2024-05-30 02:40:53,617 - mmdet - INFO - Epoch [12][2000/7330] lr: 1.000e-06, eta: 1:01:17, time: 0.671, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0406, loss_cls: 0.1503, acc: 94.1643, loss_bbox: 0.2074, loss_mask: 0.2175, loss: 0.6273 2024-05-30 02:41:27,225 - mmdet - INFO - Epoch [12][2050/7330] lr: 1.000e-06, eta: 1:00:42, time: 0.672, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0374, loss_cls: 0.1456, acc: 94.4180, loss_bbox: 0.1975, loss_mask: 0.2127, loss: 0.6047 2024-05-30 02:42:04,069 - mmdet - INFO - Epoch [12][2100/7330] lr: 1.000e-06, eta: 1:00:08, time: 0.737, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0383, loss_cls: 0.1470, acc: 94.4194, loss_bbox: 0.1988, loss_mask: 0.2090, loss: 0.6045 2024-05-30 02:42:37,529 - mmdet - INFO - Epoch [12][2150/7330] lr: 1.000e-06, eta: 0:59:33, time: 0.669, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0377, loss_cls: 0.1379, acc: 94.6753, loss_bbox: 0.1975, loss_mask: 0.2086, loss: 0.5924 2024-05-30 02:43:11,438 - mmdet - INFO - Epoch [12][2200/7330] lr: 1.000e-06, eta: 0:58:59, time: 0.678, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0379, loss_cls: 0.1488, acc: 94.3977, loss_bbox: 0.2036, loss_mask: 0.2114, loss: 0.6135 2024-05-30 02:43:45,405 - mmdet - INFO - Epoch [12][2250/7330] lr: 1.000e-06, eta: 0:58:24, time: 0.679, data_time: 0.057, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0393, loss_cls: 0.1467, acc: 94.3882, loss_bbox: 0.1991, loss_mask: 0.2072, loss: 0.6054 2024-05-30 02:44:19,087 - mmdet - INFO - Epoch [12][2300/7330] lr: 1.000e-06, eta: 0:57:50, time: 0.674, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0392, loss_cls: 0.1463, acc: 94.3765, loss_bbox: 0.2018, loss_mask: 0.2121, loss: 0.6121 2024-05-30 02:44:52,474 - mmdet - INFO - Epoch [12][2350/7330] lr: 1.000e-06, eta: 0:57:15, time: 0.668, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0360, loss_cls: 0.1402, acc: 94.5732, loss_bbox: 0.1950, loss_mask: 0.2117, loss: 0.5942 2024-05-30 02:45:26,489 - mmdet - INFO - Epoch [12][2400/7330] lr: 1.000e-06, eta: 0:56:41, time: 0.680, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0383, loss_cls: 0.1483, acc: 94.3948, loss_bbox: 0.2003, loss_mask: 0.2123, 
loss: 0.6111 2024-05-30 02:46:00,043 - mmdet - INFO - Epoch [12][2450/7330] lr: 1.000e-06, eta: 0:56:06, time: 0.671, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0391, loss_cls: 0.1523, acc: 94.1340, loss_bbox: 0.2074, loss_mask: 0.2143, loss: 0.6259 2024-05-30 02:46:33,231 - mmdet - INFO - Epoch [12][2500/7330] lr: 1.000e-06, eta: 0:55:32, time: 0.664, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0375, loss_cls: 0.1446, acc: 94.4214, loss_bbox: 0.1997, loss_mask: 0.2090, loss: 0.6023 2024-05-30 02:47:06,715 - mmdet - INFO - Epoch [12][2550/7330] lr: 1.000e-06, eta: 0:54:57, time: 0.670, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0365, loss_cls: 0.1480, acc: 94.3601, loss_bbox: 0.2016, loss_mask: 0.2155, loss: 0.6133 2024-05-30 02:47:40,048 - mmdet - INFO - Epoch [12][2600/7330] lr: 1.000e-06, eta: 0:54:23, time: 0.667, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0369, loss_cls: 0.1441, acc: 94.5366, loss_bbox: 0.2007, loss_mask: 0.2099, loss: 0.6030 2024-05-30 02:48:18,766 - mmdet - INFO - Epoch [12][2650/7330] lr: 1.000e-06, eta: 0:53:48, time: 0.774, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0364, loss_cls: 0.1440, acc: 94.5212, loss_bbox: 0.1953, loss_mask: 0.2097, loss: 0.5970 2024-05-30 02:48:55,187 - mmdet - INFO - Epoch [12][2700/7330] lr: 1.000e-06, eta: 0:53:14, time: 0.729, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0399, loss_cls: 0.1500, acc: 94.2749, loss_bbox: 0.1983, loss_mask: 0.2118, loss: 0.6123 2024-05-30 02:49:34,745 - mmdet - INFO - Epoch [12][2750/7330] lr: 1.000e-06, eta: 0:52:40, time: 0.791, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0361, loss_cls: 0.1421, acc: 94.5876, loss_bbox: 0.1894, loss_mask: 0.2085, loss: 0.5873 2024-05-30 02:50:13,753 - mmdet - INFO - Epoch [12][2800/7330] lr: 1.000e-06, eta: 0:52:05, time: 0.780, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0393, loss_cls: 0.1472, acc: 94.4285, loss_bbox: 0.2018, loss_mask: 0.2119, loss: 0.6127 2024-05-30 02:50:47,089 - mmdet - INFO - Epoch [12][2850/7330] lr: 1.000e-06, eta: 0:51:31, time: 0.667, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0343, loss_cls: 0.1354, acc: 94.7810, loss_bbox: 0.1877, loss_mask: 0.2075, loss: 0.5765 2024-05-30 02:51:20,724 - mmdet - INFO - Epoch [12][2900/7330] lr: 1.000e-06, eta: 0:50:56, time: 0.673, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0368, loss_cls: 0.1472, acc: 94.3496, loss_bbox: 0.2020, loss_mask: 0.2140, loss: 0.6121 2024-05-30 02:51:57,362 - mmdet - INFO - Epoch [12][2950/7330] lr: 1.000e-06, eta: 0:50:22, time: 0.733, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0396, loss_cls: 0.1478, acc: 94.3276, loss_bbox: 0.2031, loss_mask: 0.2147, loss: 0.6184 2024-05-30 02:52:30,532 - mmdet - INFO - Epoch [12][3000/7330] lr: 1.000e-06, eta: 0:49:47, time: 0.664, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0362, loss_cls: 0.1430, acc: 94.5557, loss_bbox: 0.1936, loss_mask: 0.2152, loss: 0.5999 2024-05-30 02:53:03,725 - mmdet - INFO - Epoch [12][3050/7330] lr: 1.000e-06, eta: 0:49:13, time: 0.664, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0363, loss_cls: 0.1452, acc: 94.4871, loss_bbox: 0.1965, loss_mask: 0.2083, loss: 0.5978 2024-05-30 02:53:37,238 - mmdet - INFO - Epoch [12][3100/7330] lr: 1.000e-06, eta: 
0:48:38, time: 0.670, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0400, loss_cls: 0.1457, acc: 94.3984, loss_bbox: 0.1985, loss_mask: 0.2116, loss: 0.6087 2024-05-30 02:54:10,988 - mmdet - INFO - Epoch [12][3150/7330] lr: 1.000e-06, eta: 0:48:04, time: 0.675, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0368, loss_cls: 0.1390, acc: 94.6802, loss_bbox: 0.1902, loss_mask: 0.2049, loss: 0.5825 2024-05-30 02:54:44,777 - mmdet - INFO - Epoch [12][3200/7330] lr: 1.000e-06, eta: 0:47:29, time: 0.676, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0381, loss_cls: 0.1444, acc: 94.5112, loss_bbox: 0.2019, loss_mask: 0.2089, loss: 0.6049 2024-05-30 02:55:18,271 - mmdet - INFO - Epoch [12][3250/7330] lr: 1.000e-06, eta: 0:46:55, time: 0.670, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0366, loss_cls: 0.1411, acc: 94.5813, loss_bbox: 0.1933, loss_mask: 0.2074, loss: 0.5901 2024-05-30 02:55:51,467 - mmdet - INFO - Epoch [12][3300/7330] lr: 1.000e-06, eta: 0:46:20, time: 0.664, data_time: 0.041, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0361, loss_cls: 0.1451, acc: 94.5232, loss_bbox: 0.1949, loss_mask: 0.2095, loss: 0.5978 2024-05-30 02:56:24,167 - mmdet - INFO - Epoch [12][3350/7330] lr: 1.000e-06, eta: 0:45:45, time: 0.654, data_time: 0.039, memory: 11628, loss_rpn_cls: 0.0093, loss_rpn_bbox: 0.0357, loss_cls: 0.1283, acc: 95.0996, loss_bbox: 0.1790, loss_mask: 0.2040, loss: 0.5563 2024-05-30 02:56:57,492 - mmdet - INFO - Epoch [12][3400/7330] lr: 1.000e-06, eta: 0:45:11, time: 0.666, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0389, loss_cls: 0.1459, acc: 94.4792, loss_bbox: 0.1971, loss_mask: 0.2127, loss: 0.6072 2024-05-30 02:57:31,012 - mmdet - INFO - Epoch [12][3450/7330] lr: 1.000e-06, eta: 0:44:36, time: 0.671, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0389, loss_cls: 0.1427, acc: 94.5498, loss_bbox: 0.1959, loss_mask: 0.2103, loss: 0.5996 2024-05-30 02:58:04,977 - mmdet - INFO - Epoch [12][3500/7330] lr: 1.000e-06, eta: 0:44:02, time: 0.679, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0391, loss_cls: 0.1482, acc: 94.3748, loss_bbox: 0.2009, loss_mask: 0.2154, loss: 0.6163 2024-05-30 02:58:43,620 - mmdet - INFO - Epoch [12][3550/7330] lr: 1.000e-06, eta: 0:43:28, time: 0.773, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0379, loss_cls: 0.1396, acc: 94.6201, loss_bbox: 0.1979, loss_mask: 0.2101, loss: 0.5961 2024-05-30 02:59:24,827 - mmdet - INFO - Epoch [12][3600/7330] lr: 1.000e-06, eta: 0:42:53, time: 0.824, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0390, loss_cls: 0.1475, acc: 94.3538, loss_bbox: 0.2016, loss_mask: 0.2123, loss: 0.6130 2024-05-30 03:00:00,068 - mmdet - INFO - Epoch [12][3650/7330] lr: 1.000e-06, eta: 0:42:19, time: 0.705, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0366, loss_cls: 0.1417, acc: 94.5725, loss_bbox: 0.1970, loss_mask: 0.2123, loss: 0.5993 2024-05-30 03:00:35,488 - mmdet - INFO - Epoch [12][3700/7330] lr: 1.000e-06, eta: 0:41:44, time: 0.708, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0358, loss_cls: 0.1390, acc: 94.7654, loss_bbox: 0.1896, loss_mask: 0.2073, loss: 0.5829 2024-05-30 03:01:08,980 - mmdet - INFO - Epoch [12][3750/7330] lr: 1.000e-06, eta: 0:41:10, time: 0.670, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 
0.0374, loss_cls: 0.1448, acc: 94.5198, loss_bbox: 0.1983, loss_mask: 0.2143, loss: 0.6078 2024-05-30 03:01:42,799 - mmdet - INFO - Epoch [12][3800/7330] lr: 1.000e-06, eta: 0:40:35, time: 0.676, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0384, loss_cls: 0.1447, acc: 94.4280, loss_bbox: 0.1935, loss_mask: 0.2143, loss: 0.6029 2024-05-30 03:02:18,717 - mmdet - INFO - Epoch [12][3850/7330] lr: 1.000e-06, eta: 0:40:01, time: 0.718, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0380, loss_cls: 0.1431, acc: 94.5366, loss_bbox: 0.1985, loss_mask: 0.2142, loss: 0.6057 2024-05-30 03:02:51,635 - mmdet - INFO - Epoch [12][3900/7330] lr: 1.000e-06, eta: 0:39:26, time: 0.658, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0348, loss_cls: 0.1379, acc: 94.7495, loss_bbox: 0.1855, loss_mask: 0.2103, loss: 0.5803 2024-05-30 03:03:24,932 - mmdet - INFO - Epoch [12][3950/7330] lr: 1.000e-06, eta: 0:38:52, time: 0.666, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0385, loss_cls: 0.1453, acc: 94.4529, loss_bbox: 0.1984, loss_mask: 0.2114, loss: 0.6058 2024-05-30 03:03:58,578 - mmdet - INFO - Epoch [12][4000/7330] lr: 1.000e-06, eta: 0:38:17, time: 0.673, data_time: 0.055, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0360, loss_cls: 0.1405, acc: 94.6558, loss_bbox: 0.1957, loss_mask: 0.2089, loss: 0.5934 2024-05-30 03:04:32,349 - mmdet - INFO - Epoch [12][4050/7330] lr: 1.000e-06, eta: 0:37:43, time: 0.676, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0392, loss_cls: 0.1447, acc: 94.5601, loss_bbox: 0.2005, loss_mask: 0.2138, loss: 0.6114 2024-05-30 03:05:05,865 - mmdet - INFO - Epoch [12][4100/7330] lr: 1.000e-06, eta: 0:37:08, time: 0.670, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0394, loss_cls: 0.1488, acc: 94.2334, loss_bbox: 0.2031, loss_mask: 0.2112, loss: 0.6152 2024-05-30 03:05:39,843 - mmdet - INFO - Epoch [12][4150/7330] lr: 1.000e-06, eta: 0:36:34, time: 0.679, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0348, loss_cls: 0.1385, acc: 94.6792, loss_bbox: 0.1955, loss_mask: 0.2112, loss: 0.5908 2024-05-30 03:06:12,895 - mmdet - INFO - Epoch [12][4200/7330] lr: 1.000e-06, eta: 0:35:59, time: 0.661, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0345, loss_cls: 0.1325, acc: 94.9531, loss_bbox: 0.1860, loss_mask: 0.2044, loss: 0.5680 2024-05-30 03:06:46,472 - mmdet - INFO - Epoch [12][4250/7330] lr: 1.000e-06, eta: 0:35:25, time: 0.672, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0359, loss_cls: 0.1339, acc: 94.9250, loss_bbox: 0.1829, loss_mask: 0.2043, loss: 0.5684 2024-05-30 03:07:20,320 - mmdet - INFO - Epoch [12][4300/7330] lr: 1.000e-06, eta: 0:34:50, time: 0.677, data_time: 0.064, memory: 11628, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0410, loss_cls: 0.1507, acc: 94.2214, loss_bbox: 0.2054, loss_mask: 0.2178, loss: 0.6282 2024-05-30 03:07:53,511 - mmdet - INFO - Epoch [12][4350/7330] lr: 1.000e-06, eta: 0:34:15, time: 0.664, data_time: 0.040, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0369, loss_cls: 0.1403, acc: 94.5952, loss_bbox: 0.1897, loss_mask: 0.2104, loss: 0.5893 2024-05-30 03:08:30,022 - mmdet - INFO - Epoch [12][4400/7330] lr: 1.000e-06, eta: 0:33:41, time: 0.730, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0382, loss_cls: 0.1443, acc: 94.5068, loss_bbox: 0.2007, loss_mask: 0.2049, loss: 0.5999 
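For reference, the total loss reported in each entry above is the sum of the five component losses (loss_rpn_cls, loss_rpn_bbox, loss_cls, loss_bbox, loss_mask); the acc field is the box-head classification accuracy, not a loss term. A minimal sketch (not part of the log) checking this against the Epoch [12][4400/7330] entry above:

# Values copied from the Epoch [12][4400/7330] log entry above.
components = {
    "loss_rpn_cls": 0.0118,
    "loss_rpn_bbox": 0.0382,
    "loss_cls": 0.1443,
    "loss_bbox": 0.2007,
    "loss_mask": 0.2049,
}
total = sum(components.values())
print(f"sum of components = {total:.4f}")  # prints 0.5999, matching the logged loss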
2024-05-30 03:09:07,191 - mmdet - INFO - Epoch [12][4450/7330] lr: 1.000e-06, eta: 0:33:07, time: 0.743, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0397, loss_cls: 0.1454, acc: 94.3867, loss_bbox: 0.1952, loss_mask: 0.2099, loss: 0.6021 2024-05-30 03:09:47,668 - mmdet - INFO - Epoch [12][4500/7330] lr: 1.000e-06, eta: 0:32:32, time: 0.809, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0372, loss_cls: 0.1414, acc: 94.6340, loss_bbox: 0.1944, loss_mask: 0.2118, loss: 0.5971 2024-05-30 03:10:22,977 - mmdet - INFO - Epoch [12][4550/7330] lr: 1.000e-06, eta: 0:31:58, time: 0.706, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0344, loss_cls: 0.1355, acc: 94.8054, loss_bbox: 0.1879, loss_mask: 0.2067, loss: 0.5753 2024-05-30 03:10:59,060 - mmdet - INFO - Epoch [12][4600/7330] lr: 1.000e-06, eta: 0:31:23, time: 0.722, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0385, loss_cls: 0.1499, acc: 94.2344, loss_bbox: 0.2046, loss_mask: 0.2135, loss: 0.6189 2024-05-30 03:11:32,900 - mmdet - INFO - Epoch [12][4650/7330] lr: 1.000e-06, eta: 0:30:49, time: 0.677, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0368, loss_cls: 0.1451, acc: 94.4187, loss_bbox: 0.1985, loss_mask: 0.2096, loss: 0.6024 2024-05-30 03:12:06,358 - mmdet - INFO - Epoch [12][4700/7330] lr: 1.000e-06, eta: 0:30:14, time: 0.669, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0375, loss_cls: 0.1432, acc: 94.5840, loss_bbox: 0.1955, loss_mask: 0.2096, loss: 0.5973 2024-05-30 03:12:42,802 - mmdet - INFO - Epoch [12][4750/7330] lr: 1.000e-06, eta: 0:29:40, time: 0.729, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0386, loss_cls: 0.1457, acc: 94.4167, loss_bbox: 0.2005, loss_mask: 0.2146, loss: 0.6114 2024-05-30 03:13:16,184 - mmdet - INFO - Epoch [12][4800/7330] lr: 1.000e-06, eta: 0:29:05, time: 0.668, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0374, loss_cls: 0.1438, acc: 94.5474, loss_bbox: 0.1944, loss_mask: 0.2091, loss: 0.5974 2024-05-30 03:13:49,596 - mmdet - INFO - Epoch [12][4850/7330] lr: 1.000e-06, eta: 0:28:31, time: 0.668, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0358, loss_cls: 0.1447, acc: 94.5186, loss_bbox: 0.1979, loss_mask: 0.2104, loss: 0.6017 2024-05-30 03:14:23,630 - mmdet - INFO - Epoch [12][4900/7330] lr: 1.000e-06, eta: 0:27:56, time: 0.681, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0381, loss_cls: 0.1487, acc: 94.3459, loss_bbox: 0.2039, loss_mask: 0.2116, loss: 0.6149 2024-05-30 03:14:57,578 - mmdet - INFO - Epoch [12][4950/7330] lr: 1.000e-06, eta: 0:27:22, time: 0.679, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0388, loss_cls: 0.1466, acc: 94.4155, loss_bbox: 0.1998, loss_mask: 0.2085, loss: 0.6061 2024-05-30 03:15:31,401 - mmdet - INFO - Epoch [12][5000/7330] lr: 1.000e-06, eta: 0:26:47, time: 0.676, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0374, loss_cls: 0.1389, acc: 94.7009, loss_bbox: 0.1936, loss_mask: 0.2067, loss: 0.5884 2024-05-30 03:16:04,171 - mmdet - INFO - Epoch [12][5050/7330] lr: 1.000e-06, eta: 0:26:13, time: 0.655, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0356, loss_cls: 0.1415, acc: 94.5691, loss_bbox: 0.1930, loss_mask: 0.2139, loss: 0.5953 2024-05-30 03:16:37,331 - mmdet - INFO - Epoch [12][5100/7330] lr: 1.000e-06, eta: 0:25:38, 
time: 0.663, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0367, loss_cls: 0.1388, acc: 94.6865, loss_bbox: 0.1922, loss_mask: 0.2093, loss: 0.5881 2024-05-30 03:17:11,228 - mmdet - INFO - Epoch [12][5150/7330] lr: 1.000e-06, eta: 0:25:04, time: 0.678, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0384, loss_cls: 0.1397, acc: 94.6777, loss_bbox: 0.1926, loss_mask: 0.2074, loss: 0.5902 2024-05-30 03:17:44,938 - mmdet - INFO - Epoch [12][5200/7330] lr: 1.000e-06, eta: 0:24:29, time: 0.674, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0399, loss_cls: 0.1515, acc: 94.1924, loss_bbox: 0.2059, loss_mask: 0.2182, loss: 0.6283 2024-05-30 03:18:18,639 - mmdet - INFO - Epoch [12][5250/7330] lr: 1.000e-06, eta: 0:23:55, time: 0.674, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0376, loss_cls: 0.1440, acc: 94.4722, loss_bbox: 0.1934, loss_mask: 0.2111, loss: 0.5993 2024-05-30 03:18:57,031 - mmdet - INFO - Epoch [12][5300/7330] lr: 1.000e-06, eta: 0:23:20, time: 0.768, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0373, loss_cls: 0.1462, acc: 94.3882, loss_bbox: 0.2035, loss_mask: 0.2134, loss: 0.6127 2024-05-30 03:19:34,812 - mmdet - INFO - Epoch [12][5350/7330] lr: 1.000e-06, eta: 0:22:46, time: 0.756, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0372, loss_cls: 0.1418, acc: 94.5801, loss_bbox: 0.1935, loss_mask: 0.2095, loss: 0.5944 2024-05-30 03:20:10,388 - mmdet - INFO - Epoch [12][5400/7330] lr: 1.000e-06, eta: 0:22:11, time: 0.711, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0363, loss_cls: 0.1454, acc: 94.4431, loss_bbox: 0.2026, loss_mask: 0.2132, loss: 0.6091 2024-05-30 03:20:48,491 - mmdet - INFO - Epoch [12][5450/7330] lr: 1.000e-06, eta: 0:21:37, time: 0.762, data_time: 0.060, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0395, loss_cls: 0.1521, acc: 94.2019, loss_bbox: 0.2052, loss_mask: 0.2165, loss: 0.6257 2024-05-30 03:21:21,880 - mmdet - INFO - Epoch [12][5500/7330] lr: 1.000e-06, eta: 0:21:02, time: 0.668, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0357, loss_cls: 0.1422, acc: 94.5554, loss_bbox: 0.1944, loss_mask: 0.2129, loss: 0.5960 2024-05-30 03:21:55,514 - mmdet - INFO - Epoch [12][5550/7330] lr: 1.000e-06, eta: 0:20:28, time: 0.673, data_time: 0.061, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0363, loss_cls: 0.1400, acc: 94.7036, loss_bbox: 0.1915, loss_mask: 0.2028, loss: 0.5815 2024-05-30 03:22:31,299 - mmdet - INFO - Epoch [12][5600/7330] lr: 1.000e-06, eta: 0:19:53, time: 0.716, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0369, loss_cls: 0.1422, acc: 94.5427, loss_bbox: 0.1965, loss_mask: 0.2088, loss: 0.5962 2024-05-30 03:23:04,939 - mmdet - INFO - Epoch [12][5650/7330] lr: 1.000e-06, eta: 0:19:19, time: 0.673, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0366, loss_cls: 0.1408, acc: 94.6548, loss_bbox: 0.1945, loss_mask: 0.2133, loss: 0.5964 2024-05-30 03:23:38,501 - mmdet - INFO - Epoch [12][5700/7330] lr: 1.000e-06, eta: 0:18:44, time: 0.671, data_time: 0.052, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0383, loss_cls: 0.1416, acc: 94.6042, loss_bbox: 0.1942, loss_mask: 0.2064, loss: 0.5937 2024-05-30 03:24:11,284 - mmdet - INFO - Epoch [12][5750/7330] lr: 1.000e-06, eta: 0:18:10, time: 0.656, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0365, 
loss_cls: 0.1400, acc: 94.7285, loss_bbox: 0.1952, loss_mask: 0.2137, loss: 0.5965 2024-05-30 03:24:45,054 - mmdet - INFO - Epoch [12][5800/7330] lr: 1.000e-06, eta: 0:17:35, time: 0.675, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0363, loss_cls: 0.1380, acc: 94.6511, loss_bbox: 0.1934, loss_mask: 0.2119, loss: 0.5911 2024-05-30 03:25:18,838 - mmdet - INFO - Epoch [12][5850/7330] lr: 1.000e-06, eta: 0:17:01, time: 0.676, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0381, loss_cls: 0.1405, acc: 94.6497, loss_bbox: 0.1947, loss_mask: 0.2077, loss: 0.5924 2024-05-30 03:25:52,671 - mmdet - INFO - Epoch [12][5900/7330] lr: 1.000e-06, eta: 0:16:26, time: 0.677, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0388, loss_cls: 0.1468, acc: 94.4553, loss_bbox: 0.1999, loss_mask: 0.2089, loss: 0.6074 2024-05-30 03:26:26,325 - mmdet - INFO - Epoch [12][5950/7330] lr: 1.000e-06, eta: 0:15:52, time: 0.673, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0392, loss_cls: 0.1460, acc: 94.3330, loss_bbox: 0.2006, loss_mask: 0.2137, loss: 0.6118 2024-05-30 03:27:00,363 - mmdet - INFO - Epoch [12][6000/7330] lr: 1.000e-06, eta: 0:15:17, time: 0.681, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0379, loss_cls: 0.1413, acc: 94.6750, loss_bbox: 0.1932, loss_mask: 0.2111, loss: 0.5958 2024-05-30 03:27:33,589 - mmdet - INFO - Epoch [12][6050/7330] lr: 1.000e-06, eta: 0:14:43, time: 0.665, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0397, loss_cls: 0.1465, acc: 94.3801, loss_bbox: 0.2044, loss_mask: 0.2177, loss: 0.6212 2024-05-30 03:28:06,687 - mmdet - INFO - Epoch [12][6100/7330] lr: 1.000e-06, eta: 0:14:08, time: 0.662, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0369, loss_cls: 0.1482, acc: 94.3494, loss_bbox: 0.2027, loss_mask: 0.2180, loss: 0.6178 2024-05-30 03:28:42,708 - mmdet - INFO - Epoch [12][6150/7330] lr: 1.000e-06, eta: 0:13:34, time: 0.720, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0384, loss_cls: 0.1482, acc: 94.2852, loss_bbox: 0.1995, loss_mask: 0.2121, loss: 0.6106 2024-05-30 03:29:18,516 - mmdet - INFO - Epoch [12][6200/7330] lr: 1.000e-06, eta: 0:12:59, time: 0.716, data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0368, loss_cls: 0.1440, acc: 94.4585, loss_bbox: 0.1982, loss_mask: 0.2163, loss: 0.6065 2024-05-30 03:29:58,184 - mmdet - INFO - Epoch [12][6250/7330] lr: 1.000e-06, eta: 0:12:25, time: 0.793, data_time: 0.051, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0394, loss_cls: 0.1484, acc: 94.3250, loss_bbox: 0.2020, loss_mask: 0.2133, loss: 0.6155 2024-05-30 03:30:33,574 - mmdet - INFO - Epoch [12][6300/7330] lr: 1.000e-06, eta: 0:11:50, time: 0.708, data_time: 0.050, memory: 11628, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0379, loss_cls: 0.1432, acc: 94.4475, loss_bbox: 0.2008, loss_mask: 0.2119, loss: 0.6060 2024-05-30 03:31:09,528 - mmdet - INFO - Epoch [12][6350/7330] lr: 1.000e-06, eta: 0:11:16, time: 0.719, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0369, loss_cls: 0.1478, acc: 94.3604, loss_bbox: 0.2007, loss_mask: 0.2141, loss: 0.6107 2024-05-30 03:31:42,709 - mmdet - INFO - Epoch [12][6400/7330] lr: 1.000e-06, eta: 0:10:41, time: 0.664, data_time: 0.043, memory: 11628, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0358, loss_cls: 0.1374, acc: 94.7827, loss_bbox: 0.1923, loss_mask: 0.2057, loss: 0.5823 2024-05-30 
03:32:16,604 - mmdet - INFO - Epoch [12][6450/7330] lr: 1.000e-06, eta: 0:10:07, time: 0.678, data_time: 0.054, memory: 11628, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0355, loss_cls: 0.1403, acc: 94.7102, loss_bbox: 0.1882, loss_mask: 0.2063, loss: 0.5813 2024-05-30 03:32:52,337 - mmdet - INFO - Epoch [12][6500/7330] lr: 1.000e-06, eta: 0:09:32, time: 0.715, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0396, loss_cls: 0.1500, acc: 94.2703, loss_bbox: 0.2041, loss_mask: 0.2110, loss: 0.6172 2024-05-30 03:33:26,124 - mmdet - INFO - Epoch [12][6550/7330] lr: 1.000e-06, eta: 0:08:58, time: 0.676, data_time: 0.058, memory: 11628, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0410, loss_cls: 0.1529, acc: 94.1572, loss_bbox: 0.2120, loss_mask: 0.2174, loss: 0.6369 2024-05-30 03:33:59,260 - mmdet - INFO - Epoch [12][6600/7330] lr: 1.000e-06, eta: 0:08:23, time: 0.663, data_time: 0.042, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0355, loss_cls: 0.1437, acc: 94.5347, loss_bbox: 0.1933, loss_mask: 0.2030, loss: 0.5874 2024-05-30 03:34:32,589 - mmdet - INFO - Epoch [12][6650/7330] lr: 1.000e-06, eta: 0:07:49, time: 0.667, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0388, loss_cls: 0.1441, acc: 94.5171, loss_bbox: 0.2020, loss_mask: 0.2105, loss: 0.6080 2024-05-30 03:35:05,897 - mmdet - INFO - Epoch [12][6700/7330] lr: 1.000e-06, eta: 0:07:14, time: 0.666, data_time: 0.045, memory: 11628, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0350, loss_cls: 0.1390, acc: 94.6963, loss_bbox: 0.1936, loss_mask: 0.2065, loss: 0.5852 2024-05-30 03:35:39,338 - mmdet - INFO - Epoch [12][6750/7330] lr: 1.000e-06, eta: 0:06:40, time: 0.669, data_time: 0.049, memory: 11628, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0352, loss_cls: 0.1423, acc: 94.5652, loss_bbox: 0.1959, loss_mask: 0.2136, loss: 0.5991 2024-05-30 03:36:12,460 - mmdet - INFO - Epoch [12][6800/7330] lr: 1.000e-06, eta: 0:06:05, time: 0.662, data_time: 0.047, memory: 11628, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0360, loss_cls: 0.1387, acc: 94.7297, loss_bbox: 0.1880, loss_mask: 0.2050, loss: 0.5785 2024-05-30 03:36:46,650 - mmdet - INFO - Epoch [12][6850/7330] lr: 1.000e-06, eta: 0:05:31, time: 0.684, data_time: 0.068, memory: 11628, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0421, loss_cls: 0.1524, acc: 94.1960, loss_bbox: 0.2068, loss_mask: 0.2142, loss: 0.6287 2024-05-30 03:37:19,568 - mmdet - INFO - Epoch [12][6900/7330] lr: 1.000e-06, eta: 0:04:56, time: 0.659, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0364, loss_cls: 0.1418, acc: 94.6509, loss_bbox: 0.1942, loss_mask: 0.2094, loss: 0.5944 2024-05-30 03:37:53,155 - mmdet - INFO - Epoch [12][6950/7330] lr: 1.000e-06, eta: 0:04:22, time: 0.672, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0359, loss_cls: 0.1392, acc: 94.6528, loss_bbox: 0.1952, loss_mask: 0.2083, loss: 0.5904 2024-05-30 03:38:26,661 - mmdet - INFO - Epoch [12][7000/7330] lr: 1.000e-06, eta: 0:03:47, time: 0.670, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0381, loss_cls: 0.1459, acc: 94.3757, loss_bbox: 0.1952, loss_mask: 0.2092, loss: 0.6004 2024-05-30 03:39:04,219 - mmdet - INFO - Epoch [12][7050/7330] lr: 1.000e-06, eta: 0:03:13, time: 0.751, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0366, loss_cls: 0.1400, acc: 94.6868, loss_bbox: 0.1919, loss_mask: 0.2107, loss: 0.5900 2024-05-30 03:39:43,789 - mmdet - INFO - Epoch [12][7100/7330] lr: 1.000e-06, eta: 0:02:38, time: 0.791, 
data_time: 0.046, memory: 11628, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0388, loss_cls: 0.1504, acc: 94.3032, loss_bbox: 0.2029, loss_mask: 0.2114, loss: 0.6164
2024-05-30 03:40:20,016 - mmdet - INFO - Epoch [12][7150/7330] lr: 1.000e-06, eta: 0:02:04, time: 0.725, data_time: 0.044, memory: 11628, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0344, loss_cls: 0.1305, acc: 95.0239, loss_bbox: 0.1798, loss_mask: 0.2036, loss: 0.5579
2024-05-30 03:40:58,256 - mmdet - INFO - Epoch [12][7200/7330] lr: 1.000e-06, eta: 0:01:29, time: 0.765, data_time: 0.056, memory: 11628, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0376, loss_cls: 0.1442, acc: 94.4641, loss_bbox: 0.1965, loss_mask: 0.2095, loss: 0.6007
2024-05-30 03:41:31,488 - mmdet - INFO - Epoch [12][7250/7330] lr: 1.000e-06, eta: 0:00:55, time: 0.665, data_time: 0.053, memory: 11628, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0371, loss_cls: 0.1427, acc: 94.5403, loss_bbox: 0.1948, loss_mask: 0.2115, loss: 0.5978
2024-05-30 03:42:04,907 - mmdet - INFO - Epoch [12][7300/7330] lr: 1.000e-06, eta: 0:00:20, time: 0.668, data_time: 0.048, memory: 11628, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0386, loss_cls: 0.1483, acc: 94.3694, loss_bbox: 0.2006, loss_mask: 0.2131, loss: 0.6137
2024-05-30 03:42:25,916 - mmdet - INFO - Saving checkpoint at 12 epochs
2024-05-30 03:44:13,560 - mmdet - INFO - Evaluating bbox...
2024-05-30 03:44:34,518 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.465
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.684
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.504
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.274
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.508
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.631
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.579
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.579
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.579
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.379
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.626
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.752
2024-05-30 03:44:34,519 - mmdet - INFO - Evaluating segm...
2024-05-30 03:44:59,793 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.413
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.650
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.444
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.192
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.449
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.625
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.565
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.708
2024-05-30 03:45:00,112 - mmdet - INFO - Exp name: mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.py
2024-05-30 03:45:00,114 - mmdet - INFO - Epoch(val) [12][625] bbox_mAP: 0.4650, bbox_mAP_50: 0.6840, bbox_mAP_75: 0.5040, bbox_mAP_s: 0.2740, bbox_mAP_m: 0.5080, bbox_mAP_l: 0.6310, bbox_mAP_copypaste: 0.465 0.684 0.504 0.274 0.508 0.631, segm_mAP: 0.4130, segm_mAP_50: 0.6500, segm_mAP_75: 0.4440, segm_mAP_s: 0.1920, segm_mAP_m: 0.4490, segm_mAP_l: 0.6250, segm_mAP_copypaste: 0.413 0.650 0.444 0.192 0.449 0.625
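The per-iteration losses above can be pulled out of the saved log for plotting or comparison across runs. A minimal sketch, assuming the log has been written to a plain-text file (the path below is hypothetical) with one entry per line in the "Epoch [E][I/N] ... loss: X" format shown above:

import re

# Hypothetical path to this training log saved as plain text.
LOG_PATH = "mask_rcnn_deit_tsb_1568_1120_672_fpn_1x_coco_bs16.log"

# Matches the per-iteration entries, e.g.
# "Epoch [12][4400/7330] lr: 1.000e-06, ... loss: 0.5999"
ITER_RE = re.compile(r"Epoch \[(\d+)\]\[(\d+)/(\d+)\].*?loss: ([\d.]+)")

losses = []  # (epoch, iteration, total_loss)
with open(LOG_PATH) as f:
    for line in f:
        m = ITER_RE.search(line)
        if m:
            epoch, it, _total_iters, loss = m.groups()
            losses.append((int(epoch), int(it), float(loss)))

if losses:
    mean_loss = sum(l for _, _, l in losses) / len(losses)
    print(f"parsed {len(losses)} entries, mean loss = {mean_loss:.4f}")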