2023/05/31 18:32:47 - mmengine - INFO -
------------------------------------------------------------
System environment:
    sys.platform: linux
    Python: 3.10.9 (main, Mar 8 2023, 10:47:38) [GCC 11.2.0]
    CUDA available: True
    numpy_random_seed: 530487989
    GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
    CUDA_HOME: /mnt/petrelfs/share/cuda-11.6
    NVCC: Cuda compilation tools, release 11.6, V11.6.124
    GCC: gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
    PyTorch: 1.13.1
    PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.6
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.6.1
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.6, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -fabi-version=11 -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wunused-local-typedefs -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.13.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,

    TorchVision: 0.14.1
    OpenCV: 4.7.0
    MMEngine: 0.7.3

Runtime environment:
    cudnn_benchmark: True
    mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0}
    dist_cfg: {'backend': 'nccl'}
    seed: None
    deterministic: False
    Distributed launcher: slurm
    Distributed training: True
    GPU number: 8
------------------------------------------------------------

2023/05/31 18:32:48 - mmengine - INFO - Config:
optim_wrapper = dict(
    optimizer=dict(
        type='AdamW',
        lr=0.001,
        weight_decay=0.05,
        eps=1e-08,
        betas=(0.9, 0.999),
        _scope_='mmpretrain'),
    paramwise_cfg=dict(
        norm_decay_mult=0.0,
        bias_decay_mult=0.0,
        flat_decay_mult=0.0,
        custom_keys=dict({
            '.absolute_pos_embed': dict(decay_mult=0.0),
            '.relative_position_bias_table': dict(decay_mult=0.0)
        })),
    type='AmpOptimWrapper',
    dtype='bfloat16',
    clip_grad=dict(max_norm=5.0))
param_scheduler = [
    dict(type='CosineAnnealingLR', eta_min=1e-05, by_epoch=True, begin=0)
]
train_cfg = dict(by_epoch=True, max_epochs=20, val_interval=1)
val_cfg = dict()
test_cfg = dict()
auto_scale_lr = dict(base_batch_size=1024)
model = dict(
    type='ImageClassifier',
    backbone=dict(
        type='SwinTransformer',
        arch='base',
        img_size=224,
        drop_path_rate=0.5),
    neck=dict(type='GlobalAveragePooling'),
    head=dict(
        type='LinearClsHead',
        num_classes=2,
        in_channels=1024,
        init_cfg=None,
        loss=dict(
            type='LabelSmoothLoss', label_smooth_val=0.1, mode='original'),
        cal_acc=False),
    init_cfg=[
        dict(type='TruncNormal', layer='Linear', std=0.02, bias=0.0),
        dict(type='Constant', layer='LayerNorm', val=1.0, bias=0.0)
    ],
    train_cfg=None)
dataset_type = 'CustomDataset'
data_preprocessor = dict(
    num_classes=2,
    mean=[123.675, 116.28, 103.53],
    std=[58.395, 57.12, 57.375],
    to_rgb=True)
bgr_mean = [103.53, 116.28, 123.675]
bgr_std = [57.375, 57.12, 58.395]
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='RandomResizedCrop',
        scale=224,
        backend='pillow',
        interpolation='bicubic'),
    dict(type='RandomFlip', prob=0.5, direction='horizontal'),
    dict(type='PackInputs')
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='ResizeEdge',
        scale=256,
        edge='short',
        backend='pillow',
        interpolation='bicubic'),
    dict(type='CenterCrop', crop_size=224),
    dict(type='PackInputs')
]
train_dataloader = dict(
    pin_memory=True,
    persistent_workers=True,
    collate_fn=dict(type='default_collate'),
    batch_size=128,
    num_workers=10,
    dataset=dict(
        type='CustomDataset',
        data_root='',
        ann_file=
        '/mnt/petrelfs/luzeyu/workspace/fakebench/dataset/meta/train/stablediffusionV1-5R2-dpmsolver-25-5m.csv',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='RandomResizedCrop',
                scale=224,
                backend='pillow',
                interpolation='bicubic'),
            dict(type='RandomFlip', prob=0.5, direction='horizontal'),
            dict(type='PackInputs')
        ]),
    sampler=dict(type='DefaultSampler', shuffle=True))
val_dataloader = dict(
    pin_memory=True,
    persistent_workers=True,
    collate_fn=dict(type='default_collate'),
    batch_size=256,
    num_workers=10,
    dataset=dict(
        type='CustomDataset',
        data_root='/mnt/petrelfs/luzeyu/workspace/fakebench/dataset',
        ann_file=
        '/mnt/petrelfs/luzeyu/workspace/fakebench/dataset/meta/val/stablediffusionV1-5R2-dpmsolver-25-1w.tsv',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='ResizeEdge',
                scale=256,
                edge='short',
                backend='pillow',
                interpolation='bicubic'),
            dict(type='CenterCrop', crop_size=224),
            dict(type='PackInputs')
        ]),
    sampler=dict(type='DefaultSampler', shuffle=False))
val_evaluator = [
    dict(type='Accuracy', topk=1),
    dict(type='SingleLabelMetric', average=None)
]
test_dataloader = dict(
    pin_memory=True,
    persistent_workers=True,
    collate_fn=dict(type='default_collate'),
    batch_size=256,
    num_workers=10,
    dataset=dict(
        type='CustomDataset',
        data_root='/mnt/petrelfs/luzeyu/workspace/fakebench/dataset',
        ann_file=
        '/mnt/petrelfs/luzeyu/workspace/fakebench/dataset/meta/val/stablediffusionV1-5R2-dpmsolver-25-1w.tsv',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='ResizeEdge',
                scale=256,
                edge='short',
                backend='pillow',
                interpolation='bicubic'),
            dict(type='CenterCrop', crop_size=224),
            dict(type='PackInputs')
        ]),
    sampler=dict(type='DefaultSampler', shuffle=False))
test_evaluator = [
    dict(type='Accuracy', topk=1),
    dict(type='SingleLabelMetric', average=None)
]
default_scope = 'mmpretrain'
default_hooks = dict(
    timer=dict(type='IterTimerHook'),
    logger=dict(type='LoggerHook', interval=100),
    param_scheduler=dict(type='ParamSchedulerHook'),
    checkpoint=dict(type='CheckpointHook', interval=1),
    sampler_seed=dict(type='DistSamplerSeedHook'),
    visualization=dict(type='VisualizationHook', enable=True))
env_cfg = dict(
    cudnn_benchmark=True,
    mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0),
    dist_cfg=dict(backend='nccl'))
vis_backends = [dict(type='LocalVisBackend')]
visualizer = dict(
    type='UniversalVisualizer',
    vis_backends=[
        dict(type='LocalVisBackend'),
        dict(type='TensorboardVisBackend')
    ])
log_level = 'INFO'
load_from = None
resume = False
randomness = dict(seed=None, deterministic=False)
launcher = 'slurm'
work_dir = 'workdir/swin_base_8xb128_1e-3lr_5m'

2023/05/31 18:32:52 - mmengine - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) RuntimeInfoHook
(BELOW_NORMAL) LoggerHook
 --------------------
before_train:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_train_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(NORMAL      ) DistSamplerSeedHook
 --------------------
before_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
 --------------------
after_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_train_epoch:
(NORMAL      ) IterTimerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_val_epoch:
(NORMAL      ) IterTimerHook
 --------------------
before_val_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_val_iter:
(NORMAL      ) IterTimerHook
(NORMAL      ) VisualizationHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_val_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_train:
(VERY_LOW    ) CheckpointHook
 --------------------
before_test_epoch:
(NORMAL      ) IterTimerHook
 --------------------
before_test_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_test_iter:
(NORMAL      ) IterTimerHook
(NORMAL      ) VisualizationHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_test_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_run:
(BELOW_NORMAL) LoggerHook
 --------------------
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.patch_embed.projection.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.patch_embed.norm.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.patch_embed.norm.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.0.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.blocks.1.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.downsample.norm.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.0.downsample.norm.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.0.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.blocks.1.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.downsample.norm.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.1.downsample.norm.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.0.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.1.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.2.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.3.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.4.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.5.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.6.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.7.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.8.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.9.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.10.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.11.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.12.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.13.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.norm2.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.norm2.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.ffn.layers.0.0.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.14.ffn.layers.1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.norm1.weight:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.norm1.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table:lr=0.001
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table:decay_mult=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.attn.w_msa.qkv.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.attn.w_msa.proj.bias:weight_decay=0.0
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options --
backbone.stages.2.blocks.15.norm2.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.norm2.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.ffn.layers.0.0.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.15.ffn.layers.1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.norm1.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.norm1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table:lr=0.001 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table:decay_mult=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.attn.w_msa.qkv.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.attn.w_msa.proj.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.norm2.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.norm2.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.ffn.layers.0.0.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.16.ffn.layers.1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.norm1.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - 
paramwise_options -- backbone.stages.2.blocks.17.norm1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table:lr=0.001 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table:decay_mult=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.attn.w_msa.qkv.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.attn.w_msa.proj.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.norm2.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.norm2.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.ffn.layers.0.0.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.blocks.17.ffn.layers.1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.downsample.norm.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.2.downsample.norm.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.norm1.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.norm1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table:lr=0.001 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table:weight_decay=0.0 
2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table:decay_mult=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.attn.w_msa.qkv.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.attn.w_msa.proj.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.norm2.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.norm2.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.ffn.layers.0.0.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.0.ffn.layers.1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.norm1.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.norm1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table:lr=0.001 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table:decay_mult=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.attn.w_msa.qkv.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.attn.w_msa.proj.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.norm2.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- 
backbone.stages.3.blocks.1.norm2.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.ffn.layers.0.0.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.stages.3.blocks.1.ffn.layers.1.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.norm3.weight:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- backbone.norm3.bias:weight_decay=0.0 2023/05/31 18:33:19 - mmengine - INFO - paramwise_options -- head.fc.bias:weight_decay=0.0 Name of parameter - Initialization information backbone.patch_embed.projection.weight - torch.Size([128, 3, 4, 4]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.patch_embed.projection.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.patch_embed.norm.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.patch_embed.norm.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.0.norm1.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.0.norm1.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.0.blocks.0.attn.w_msa.qkv.weight - torch.Size([384, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.attn.w_msa.qkv.bias - torch.Size([384]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.attn.w_msa.proj.weight - torch.Size([128, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, 
bias=0.0 backbone.stages.0.blocks.0.attn.w_msa.proj.bias - torch.Size([128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.norm2.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.0.norm2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.0.ffn.layers.0.0.weight - torch.Size([512, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.ffn.layers.0.0.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.ffn.layers.1.weight - torch.Size([128, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.0.ffn.layers.1.bias - torch.Size([128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.norm1.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.1.norm1.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 4]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.0.blocks.1.attn.w_msa.qkv.weight - torch.Size([384, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.attn.w_msa.qkv.bias - torch.Size([384]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.attn.w_msa.proj.weight - torch.Size([128, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.attn.w_msa.proj.bias - torch.Size([128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.norm2.weight - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier 
backbone.stages.0.blocks.1.norm2.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.blocks.1.ffn.layers.0.0.weight - torch.Size([512, 128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.ffn.layers.0.0.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.ffn.layers.1.weight - torch.Size([128, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.blocks.1.ffn.layers.1.bias - torch.Size([128]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.0.downsample.norm.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.downsample.norm.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.0.downsample.reduction.weight - torch.Size([256, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.0.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.1.blocks.0.attn.w_msa.qkv.weight - torch.Size([768, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.attn.w_msa.qkv.bias - torch.Size([768]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.attn.w_msa.proj.weight - torch.Size([256, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.attn.w_msa.proj.bias - torch.Size([256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 
backbone.stages.1.blocks.0.norm2.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.0.norm2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.0.ffn.layers.0.0.weight - torch.Size([1024, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.ffn.layers.0.0.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.ffn.layers.1.weight - torch.Size([256, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.0.ffn.layers.1.bias - torch.Size([256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.norm1.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.1.norm1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 8]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.1.blocks.1.attn.w_msa.qkv.weight - torch.Size([768, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.attn.w_msa.qkv.bias - torch.Size([768]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.attn.w_msa.proj.weight - torch.Size([256, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.attn.w_msa.proj.bias - torch.Size([256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.norm2.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.blocks.1.norm2.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of 
ImageClassifier backbone.stages.1.blocks.1.ffn.layers.0.0.weight - torch.Size([1024, 256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.ffn.layers.0.0.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.ffn.layers.1.weight - torch.Size([256, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.blocks.1.ffn.layers.1.bias - torch.Size([256]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.1.downsample.norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.downsample.norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.1.downsample.reduction.weight - torch.Size([512, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.0.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.0.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` 
of ImageClassifier backbone.stages.2.blocks.0.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.0.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.0.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.1.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.1.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.1.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.1.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, 
std=0.02, bias=0.0 backbone.stages.2.blocks.1.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.1.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.2.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.2.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.2.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.2.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.2.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 
backbone.stages.2.blocks.2.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.3.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.3.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.3.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.3.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.3.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier 
backbone.stages.2.blocks.4.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.4.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.4.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.4.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.4.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.5.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined 
`init_weights` in WindowMSA backbone.stages.2.blocks.5.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.5.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.5.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.5.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.6.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.6.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, 
std=0.02, bias=0.0 backbone.stages.2.blocks.6.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.6.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.6.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.6.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.7.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.7.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA backbone.stages.2.blocks.7.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.7.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.7.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0 backbone.stages.2.blocks.7.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, 
bias=0.0
backbone.stages.2.blocks.7.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.7.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.7.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.7.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.7.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.7.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.8.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.8.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.8.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.8.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.8.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.9.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.9.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.9.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.9.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.9.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.10.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.10.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.10.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.10.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.10.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.11.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.11.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.11.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.11.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.11.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.12.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.12.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.12.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.12.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.12.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.13.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.13.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.13.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.13.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.13.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.14.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.14.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.14.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.14.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.14.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.15.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.15.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.15.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.15.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.15.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.16.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.16.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.16.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.16.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.16.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.norm1.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.17.norm1.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table - torch.Size([169, 16]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.2.blocks.17.attn.w_msa.qkv.weight - torch.Size([1536, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.attn.w_msa.qkv.bias - torch.Size([1536]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.attn.w_msa.proj.weight - torch.Size([512, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.attn.w_msa.proj.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.norm2.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.17.norm2.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.blocks.17.ffn.layers.0.0.weight - torch.Size([2048, 512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.ffn.layers.0.0.bias - torch.Size([2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.ffn.layers.1.weight - torch.Size([512, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.blocks.17.ffn.layers.1.bias - torch.Size([512]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.2.downsample.norm.weight - torch.Size([2048]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.downsample.norm.bias - torch.Size([2048]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.2.downsample.reduction.weight - torch.Size([1024, 2048]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.3.blocks.0.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.attn.w_msa.qkv.bias - torch.Size([3072]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.attn.w_msa.proj.weight - torch.Size([1024, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.attn.w_msa.proj.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.0.ffn.layers.0.0.weight - torch.Size([4096, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.ffn.layers.0.0.bias - torch.Size([4096]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.ffn.layers.1.weight - torch.Size([1024, 4096]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.0.ffn.layers.1.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([169, 32]): Initialized by user-defined `init_weights` in WindowMSA
backbone.stages.3.blocks.1.attn.w_msa.qkv.weight - torch.Size([3072, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.attn.w_msa.qkv.bias - torch.Size([3072]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.attn.w_msa.proj.weight - torch.Size([1024, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.attn.w_msa.proj.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.stages.3.blocks.1.ffn.layers.0.0.weight - torch.Size([4096, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.ffn.layers.0.0.bias - torch.Size([4096]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.ffn.layers.1.weight - torch.Size([1024, 4096]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.stages.3.blocks.1.ffn.layers.1.bias - torch.Size([1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
backbone.norm3.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
backbone.norm3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of ImageClassifier
head.fc.weight - torch.Size([2, 1024]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
head.fc.bias - torch.Size([2]): TruncNormalInit: a=-2, b=2, mean=0, std=0.02, bias=0.0
2023/05/31 18:33:19 - mmengine - WARNING - "FileClient" will be deprecated in future. Please use io functions in https://mmengine.readthedocs.io/en/latest/api/fileio.html#file-io
2023/05/31 18:33:19 - mmengine - WARNING - "HardDiskBackend" is the alias of "LocalBackend" and the former will be deprecated in future.
2023/05/31 18:33:19 - mmengine - INFO - Checkpoints will be saved to /mnt/petrelfs/luzeyu/workspace/fakebench/mmpretrain/workdir/swin_base_8xb128_1e-3lr_5m.
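The `TruncNormalInit: a=-2, b=2, mean=0, std=0.02` entries above come from truncated-normal initialization (in PyTorch this is `torch.nn.init.trunc_normal_`, where `a` and `b` are absolute cutoffs, not multiples of `std`). A minimal pure-Python sketch of the sampling rule, written without torch for illustration only:

```python
import random

def trunc_normal(n, mean=0.0, std=0.02, a=-2.0, b=2.0, seed=0):
    """Sketch of truncated-normal init: draw from N(mean, std) and
    resample any value outside the absolute cutoffs [a, b].
    With std=0.02 the cutoffs +/-2 are essentially never hit."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.gauss(mean, std)
        if a <= x <= b:  # rejection step
            out.append(x)
    return out

# e.g. a flattened stand-in for one of the TruncNormalInit weights above
w = trunc_normal(10000)
```

The named biases (`bias=0.0`) are simply zero-filled, which is why only weight-style tensors carry the distribution parameters.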
2023/05/31 18:34:13 - mmengine - INFO - Epoch(train) [1][ 100/5758] lr: 1.0000e-03 eta: 17:01:09 time: 0.4505 data_time: 0.0015 memory: 20328 grad_norm: 1.2535 loss: 0.6617
2023/05/31 18:34:59 - mmengine - INFO - Epoch(train) [1][ 200/5758] lr: 1.0000e-03 eta: 15:50:29 time: 0.4186 data_time: 0.0021 memory: 20327 grad_norm: 1.3020 loss: 0.6787
2023/05/31 18:35:45 - mmengine - INFO - Epoch(train) [1][ 300/5758] lr: 1.0000e-03 eta: 15:31:14 time: 0.4118 data_time: 0.0016 memory: 20327 grad_norm: 1.6164 loss: 0.6962
2023/05/31 18:36:32 - mmengine - INFO - Epoch(train) [1][ 400/5758] lr: 1.0000e-03 eta: 15:21:04 time: 0.4402 data_time: 0.0017 memory: 20327 grad_norm: 0.9162 loss: 0.6910
2023/05/31 18:37:18 - mmengine - INFO - Epoch(train) [1][ 500/5758] lr: 1.0000e-03 eta: 15:10:26 time: 0.4888 data_time: 0.0025 memory: 20327 grad_norm: 0.7051 loss: 0.6896
2023/05/31 18:38:04 - mmengine - INFO - Epoch(train) [1][ 600/5758] lr: 1.0000e-03 eta: 15:04:36 time: 0.4830 data_time: 0.0025 memory: 20327 grad_norm: 1.0663 loss: 0.6936
2023/05/31 18:38:50 - mmengine - INFO - Epoch(train) [1][ 700/5758] lr: 1.0000e-03 eta: 15:00:17 time: 0.4459 data_time: 0.0024 memory: 20327 grad_norm: 0.6849 loss: 0.6929
2023/05/31 18:39:35 - mmengine - INFO - Epoch(train) [1][ 800/5758] lr: 1.0000e-03 eta: 14:55:29 time: 0.4738 data_time: 0.0020 memory: 20327 grad_norm: 0.4387 loss: 0.6870
2023/05/31 18:40:20 - mmengine - INFO - Epoch(train) [1][ 900/5758] lr: 1.0000e-03 eta: 14:49:50 time: 0.4850 data_time: 0.0022 memory: 20327 grad_norm: 0.7461 loss: 0.6842
2023/05/31 18:41:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 18:41:05 - mmengine - INFO - Epoch(train) [1][1000/5758] lr: 1.0000e-03 eta: 14:46:26 time: 0.5281 data_time: 0.0018 memory: 20327 grad_norm: 0.5114 loss: 0.6821
2023/05/31 18:41:50 - mmengine - INFO - Epoch(train) [1][1100/5758] lr: 1.0000e-03 eta: 14:42:37 time: 0.4209 data_time: 0.0019 memory: 20327 grad_norm: 0.4384 loss: 0.6854
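The `grad_norm` column above reports the global gradient norm before clipping; the config's `clip_grad=dict(max_norm=5.0)` then rescales gradients whose norm exceeds 5.0 (in PyTorch this is `torch.nn.utils.clip_grad_norm_`). A torch-free sketch of that rule, for illustration only:

```python
import math

def clip_grad_norm(grads, max_norm=5.0):
    """Sketch of max_norm clipping: compute the global L2 norm over all
    gradient entries and, if it exceeds max_norm, scale every entry by
    max_norm / total_norm. Returns the pre-clipping norm, which is what
    a `grad_norm` log column reports."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads[:] = [g * scale for g in grads]
    return total_norm

g = [3.0, 4.0, 12.0]       # global norm sqrt(9 + 16 + 144) = 13
norm = clip_grad_norm(g)   # returns 13.0; g now has norm 5.0
```

Under this rule, the values logged here (mostly well under 5.0) would pass through unchanged.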
2023/05/31 18:42:36 - mmengine - INFO - Epoch(train) [1][1200/5758] lr: 1.0000e-03 eta: 14:40:45 time: 0.4580 data_time: 0.0018 memory: 20327 grad_norm: 0.5817 loss: 0.6899
2023/05/31 18:43:21 - mmengine - INFO - Epoch(train) [1][1300/5758] lr: 1.0000e-03 eta: 14:38:28 time: 0.4686 data_time: 0.0014 memory: 20327 grad_norm: 0.4683 loss: 0.6840
2023/05/31 18:44:07 - mmengine - INFO - Epoch(train) [1][1400/5758] lr: 1.0000e-03 eta: 14:36:32 time: 0.4447 data_time: 0.0020 memory: 20327 grad_norm: 0.4274 loss: 0.6886
2023/05/31 18:44:51 - mmengine - INFO - Epoch(train) [1][1500/5758] lr: 1.0000e-03 eta: 14:32:48 time: 0.4260 data_time: 0.0025 memory: 20327 grad_norm: 0.3759 loss: 0.6896
2023/05/31 18:45:35 - mmengine - INFO - Epoch(train) [1][1600/5758] lr: 1.0000e-03 eta: 14:30:42 time: 0.4184 data_time: 0.0017 memory: 20327 grad_norm: 0.4764 loss: 0.6906
2023/05/31 18:46:21 - mmengine - INFO - Epoch(train) [1][1700/5758] lr: 1.0000e-03 eta: 14:29:02 time: 0.4273 data_time: 0.0023 memory: 20327 grad_norm: 0.3174 loss: 0.6887
2023/05/31 18:47:08 - mmengine - INFO - Epoch(train) [1][1800/5758] lr: 1.0000e-03 eta: 14:29:39 time: 0.4459 data_time: 0.0526 memory: 20327 grad_norm: 0.2392 loss: 0.6858
2023/05/31 18:47:56 - mmengine - INFO - Epoch(train) [1][1900/5758] lr: 1.0000e-03 eta: 14:31:11 time: 0.5703 data_time: 0.0018 memory: 20327 grad_norm: 0.3035 loss: 0.6881
2023/05/31 18:48:46 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 18:48:46 - mmengine - INFO - Epoch(train) [1][2000/5758] lr: 1.0000e-03 eta: 14:33:21 time: 0.5233 data_time: 0.0017 memory: 20327 grad_norm: 0.2468 loss: 0.6844
2023/05/31 18:49:38 - mmengine - INFO - Epoch(train) [1][2100/5758] lr: 1.0000e-03 eta: 14:38:07 time: 0.4493 data_time: 0.0016 memory: 20327 grad_norm: 0.2431 loss: 0.6869
2023/05/31 18:50:28 - mmengine - INFO - Epoch(train) [1][2200/5758] lr: 1.0000e-03 eta: 14:40:04 time: 0.4997 data_time: 0.1156 memory: 20327 grad_norm: 0.2144 loss: 0.6881
2023/05/31 18:51:19 - mmengine - INFO - Epoch(train) [1][2300/5758] lr: 1.0000e-03 eta: 14:42:47 time: 0.4611 data_time: 0.0017 memory: 20327 grad_norm: 0.3295 loss: 0.6890
2023/05/31 18:52:16 - mmengine - INFO - Epoch(train) [1][2400/5758] lr: 1.0000e-03 eta: 14:50:09 time: 0.4768 data_time: 0.0015 memory: 20327 grad_norm: 0.3206 loss: 0.6834
2023/05/31 18:53:12 - mmengine - INFO - Epoch(train) [1][2500/5758] lr: 1.0000e-03 eta: 14:55:23 time: 0.4431 data_time: 0.0020 memory: 20327 grad_norm: 0.1802 loss: 0.6857
2023/05/31 18:54:03 - mmengine - INFO - Epoch(train) [1][2600/5758] lr: 1.0000e-03 eta: 14:57:08 time: 0.5280 data_time: 0.0015 memory: 20327 grad_norm: 0.2828 loss: 0.6845
2023/05/31 18:54:50 - mmengine - INFO - Epoch(train) [1][2700/5758] lr: 1.0000e-03 eta: 14:55:37 time: 0.4529 data_time: 0.0015 memory: 20327 grad_norm: 0.2345 loss: 0.6897
2023/05/31 18:55:38 - mmengine - INFO - Epoch(train) [1][2800/5758] lr: 1.0000e-03 eta: 14:55:23 time: 0.5384 data_time: 0.0018 memory: 20327 grad_norm: 0.1539 loss: 0.6886
2023/05/31 18:56:27 - mmengine - INFO - Epoch(train) [1][2900/5758] lr: 1.0000e-03 eta: 14:55:04 time: 0.4375 data_time: 0.0015 memory: 20327 grad_norm: 0.2427 loss: 0.6880
2023/05/31 18:57:15 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 18:57:15 - mmengine - INFO - Epoch(train) [1][3000/5758] lr: 1.0000e-03 eta: 14:54:23 time: 0.4969 data_time: 0.0014 memory: 20327 grad_norm: 0.2302 loss: 0.6911
2023/05/31 18:58:05 - mmengine - INFO - Epoch(train) [1][3100/5758] lr: 1.0000e-03 eta: 14:55:14 time: 0.5301 data_time: 0.0023 memory: 20327 grad_norm: 0.2076 loss: 0.6825
2023/05/31 18:58:55 - mmengine - INFO - Epoch(train) [1][3200/5758] lr: 1.0000e-03 eta: 14:55:35 time: 0.5860 data_time: 0.0015 memory: 20327 grad_norm: 0.2537 loss: 0.6854
2023/05/31 18:59:46 - mmengine - INFO - Epoch(train) [1][3300/5758] lr: 1.0000e-03 eta: 14:56:15 time: 0.4651 data_time: 0.0025 memory: 20327 grad_norm: 0.1809 loss: 0.6864
2023/05/31 19:00:35 - mmengine - INFO - Epoch(train) [1][3400/5758] lr: 1.0000e-03 eta: 14:55:55 time: 0.4548 data_time: 0.0018 memory: 20327 grad_norm: 0.2350 loss: 0.6890
2023/05/31 19:01:20 - mmengine - INFO - Epoch(train) [1][3500/5758] lr: 1.0000e-03 eta: 14:53:40 time: 0.4564 data_time: 0.0017 memory: 20327 grad_norm: 0.1906 loss: 0.6882
2023/05/31 19:02:07 - mmengine - INFO - Epoch(train) [1][3600/5758] lr: 1.0000e-03 eta: 14:52:00 time: 0.4628 data_time: 0.0016 memory: 20327 grad_norm: 0.2133 loss: 0.6878
2023/05/31 19:02:52 - mmengine - INFO - Epoch(train) [1][3700/5758] lr: 1.0000e-03 eta: 14:50:04 time: 0.4189 data_time: 0.0022 memory: 20327 grad_norm: 0.2188 loss: 0.6868
2023/05/31 19:03:38 - mmengine - INFO - Epoch(train) [1][3800/5758] lr: 1.0000e-03 eta: 14:48:01 time: 0.4756 data_time: 0.0015 memory: 20327 grad_norm: 0.1568 loss: 0.6829
2023/05/31 19:04:23 - mmengine - INFO - Epoch(train) [1][3900/5758] lr: 1.0000e-03 eta: 14:46:13 time: 0.5234 data_time: 0.0014 memory: 20327 grad_norm: 0.2446 loss: 0.6878
2023/05/31 19:05:10 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:05:10 - mmengine - INFO - Epoch(train) [1][4000/5758] lr: 1.0000e-03 eta: 14:44:57 time: 0.5318 data_time: 0.0016 memory: 20327 grad_norm: 0.2752 loss: 0.6842
2023/05/31 19:05:57 - mmengine - INFO - Epoch(train) [1][4100/5758] lr: 1.0000e-03 eta: 14:43:44 time: 0.4641 data_time: 0.0014 memory: 20327 grad_norm: 0.1678 loss: 0.6814
2023/05/31 19:06:48 - mmengine - INFO - Epoch(train) [1][4200/5758] lr: 1.0000e-03 eta: 14:44:16 time: 0.6016 data_time: 0.0015 memory: 20327 grad_norm: 0.2206 loss: 0.6909
2023/05/31 19:07:39 - mmengine - INFO - Epoch(train) [1][4300/5758] lr: 1.0000e-03 eta: 14:44:53 time: 0.4643 data_time: 0.0018 memory: 20327 grad_norm: 0.1305 loss: 0.6935
2023/05/31 19:08:32 - mmengine - INFO - Epoch(train) [1][4400/5758] lr: 1.0000e-03 eta: 14:46:22 time: 0.5172 data_time: 0.0018 memory: 20327 grad_norm: 0.1892 loss: 0.6804
2023/05/31 19:09:26 - mmengine - INFO - Epoch(train) [1][4500/5758] lr: 1.0000e-03 eta: 14:47:47 time: 0.5839 data_time: 0.0019 memory: 20327 grad_norm: 4.1434 loss: 0.6887
2023/05/31 19:10:18 - mmengine - INFO - Epoch(train) [1][4600/5758] lr: 1.0000e-03 eta: 14:48:38 time: 0.4826 data_time: 0.0444 memory: 20327 grad_norm: 0.2898 loss: 0.6888
2023/05/31 19:11:11 - mmengine - INFO - Epoch(train) [1][4700/5758] lr: 1.0000e-03 eta: 14:49:48 time: 0.5790 data_time: 0.0118 memory: 20327 grad_norm: 0.2279 loss: 0.6894
2023/05/31 19:11:59 - mmengine - INFO - Epoch(train) [1][4800/5758] lr: 1.0000e-03 eta: 14:48:54 time: 0.4071 data_time: 0.0430 memory: 20327 grad_norm: 0.1455 loss: 0.6874
2023/05/31 19:12:45 - mmengine - INFO - Epoch(train) [1][4900/5758] lr: 1.0000e-03 eta: 14:47:08 time: 0.4641 data_time: 0.0017 memory: 20327 grad_norm: 0.2417 loss: 0.6905
2023/05/31 19:13:31 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:13:31 - mmengine - INFO - Epoch(train) [1][5000/5758] lr: 1.0000e-03 eta: 14:45:29 time: 0.4728 data_time: 0.0017 memory: 20327 grad_norm: 0.1863 loss: 0.6864
2023/05/31 19:14:18 - mmengine - INFO - Epoch(train) [1][5100/5758] lr: 1.0000e-03 eta: 14:44:15 time: 0.4919 data_time: 0.0020 memory: 20327 grad_norm: 0.1691 loss: 0.6881
2023/05/31 19:15:03 - mmengine - INFO - Epoch(train) [1][5200/5758] lr: 1.0000e-03 eta: 14:42:21 time: 0.4347 data_time: 0.0016 memory: 20327 grad_norm: 0.2515 loss: 0.6828
2023/05/31 19:15:50 - mmengine - INFO - Epoch(train) [1][5300/5758] lr: 1.0000e-03 eta: 14:41:10 time: 0.4721 data_time: 0.0015 memory: 20327 grad_norm: 0.1911 loss: 0.6846
2023/05/31 19:16:37 - mmengine - INFO - Epoch(train) [1][5400/5758] lr: 1.0000e-03 eta: 14:39:55 time: 0.4641 data_time: 0.0014 memory: 20327 grad_norm: 0.1328 loss: 0.6849
2023/05/31 19:17:24 - mmengine - INFO - Epoch(train) [1][5500/5758] lr: 1.0000e-03 eta: 14:38:56 time: 0.5697 data_time: 0.0017 memory: 20327 grad_norm: 0.1780 loss: 0.6843
2023/05/31 19:18:10 - mmengine - INFO - Epoch(train) [1][5600/5758] lr: 1.0000e-03 eta: 14:37:20 time: 0.3915 data_time: 0.0016 memory: 20327 grad_norm: 0.1942 loss: 0.6819
2023/05/31 19:18:58 - mmengine - INFO - Epoch(train) [1][5700/5758] lr: 1.0000e-03 eta: 14:36:35 time: 0.5306 data_time: 0.0016 memory: 20327 grad_norm: 0.2729 loss: 0.6899
2023/05/31 19:19:24 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:19:24 - mmengine - INFO - Saving checkpoint at 1 epochs
2023/05/31 19:19:43 - mmengine - INFO - Epoch(val) [1][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.5274 time: 1.4208
2023/05/31 19:20:35 - mmengine - INFO - Epoch(train) [2][ 100/5758] lr: 9.9391e-04 eta: 14:35:41 time: 0.4719 data_time: 0.0019 memory: 20338 grad_norm: 0.1390 loss: 0.6857
2023/05/31 19:21:21 - mmengine - INFO - Epoch(train) [2][ 200/5758] lr: 9.9391e-04 eta: 14:34:21 time: 0.3786 data_time: 0.0015 memory: 20334 grad_norm: 0.2039 loss: 0.6875
2023/05/31 19:21:39 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:22:05 - mmengine - INFO - Epoch(train) [2][ 300/5758] lr: 9.9391e-04 eta: 14:32:13 time: 0.4775 data_time: 0.0023 memory: 20334 grad_norm: 0.1824 loss: 0.6888
2023/05/31 19:22:48 - mmengine - INFO - Epoch(train) [2][ 400/5758] lr: 9.9391e-04 eta: 14:30:01 time: 0.4025 data_time: 0.0020 memory: 20334 grad_norm: 0.1098 loss: 0.6871
2023/05/31 19:23:34 - mmengine - INFO - Epoch(train) [2][ 500/5758] lr: 9.9391e-04 eta: 14:28:41 time: 0.4650 data_time: 0.0020 memory: 20334 grad_norm: 0.1699 loss: 0.6880
2023/05/31 19:24:20 - mmengine - INFO - Epoch(train) [2][ 600/5758] lr: 9.9391e-04 eta: 14:27:18 time: 0.4857 data_time: 0.0022 memory: 20334 grad_norm: 0.2250 loss: 0.6882
2023/05/31 19:25:07 - mmengine - INFO - Epoch(train) [2][ 700/5758] lr: 9.9391e-04 eta: 14:26:28 time: 0.5223 data_time: 0.0028 memory: 20334 grad_norm: 0.0980 loss: 0.6891
2023/05/31 19:25:55 - mmengine - INFO - Epoch(train) [2][ 800/5758] lr: 9.9391e-04 eta: 14:25:44 time: 0.4998 data_time: 0.0017 memory: 20334 grad_norm: 0.2047 loss: 0.6881
2023/05/31 19:26:43 - mmengine - INFO - Epoch(train) [2][ 900/5758] lr: 9.9391e-04 eta: 14:24:48 time: 0.4293 data_time: 0.0015 memory: 20334 grad_norm: 0.1334 loss: 0.6904
2023/05/31 19:27:30 - mmengine - INFO - Epoch(train) [2][1000/5758] lr: 9.9391e-04 eta: 14:23:42 time: 0.5042 data_time: 0.0023 memory: 20334 grad_norm: 0.1210 loss: 0.6860
2023/05/31 19:28:16 - mmengine - INFO - Epoch(train) [2][1100/5758] lr: 9.9391e-04 eta: 14:22:41 time: 0.4424 data_time: 0.0016 memory: 20334 grad_norm: 0.1209 loss: 0.6867
2023/05/31 19:29:04 - mmengine - INFO - Epoch(train) [2][1200/5758] lr: 9.9391e-04 eta: 14:21:42 time: 0.4532 data_time: 0.0015 memory: 20334 grad_norm: 0.1007 loss: 0.6839
2023/05/31 19:29:23 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:29:51 - mmengine - INFO - Epoch(train) [2][1300/5758] lr: 9.9391e-04 eta: 14:20:46 time: 0.4828 data_time: 0.0014 memory: 20334 grad_norm: 0.2354 loss: 0.6881
2023/05/31 19:30:35 - mmengine - INFO - Epoch(train) [2][1400/5758] lr: 9.9391e-04 eta: 14:19:09 time: 0.4426 data_time: 0.0014 memory: 20334 grad_norm: 0.1225 loss: 0.6852
2023/05/31 19:31:21 - mmengine - INFO - Epoch(train) [2][1500/5758] lr: 9.9391e-04 eta: 14:17:49 time: 0.4048 data_time: 0.0016 memory: 20334 grad_norm: 0.1605 loss: 0.6865
2023/05/31 19:32:06 - mmengine - INFO - Epoch(train) [2][1600/5758] lr: 9.9391e-04 eta: 14:16:20 time: 0.3803 data_time: 0.0014 memory: 20334 grad_norm: 0.0794 loss: 0.6857
2023/05/31 19:32:50 - mmengine - INFO - Epoch(train) [2][1700/5758] lr: 9.9391e-04 eta: 14:14:44 time: 0.4127 data_time: 0.0015 memory: 20334 grad_norm: 0.1306 loss: 0.6866
2023/05/31 19:33:35 - mmengine - INFO - Epoch(train) [2][1800/5758] lr: 9.9391e-04 eta: 14:13:21 time: 0.4324 data_time: 0.0015 memory: 20334 grad_norm: 0.1867 loss: 0.6870
2023/05/31 19:34:20 - mmengine - INFO - Epoch(train) [2][1900/5758] lr: 9.9391e-04 eta: 14:11:48 time: 0.4226 data_time: 0.0015 memory: 20334 grad_norm: 0.1899 loss: 0.6858
2023/05/31 19:35:05 - mmengine - INFO - Epoch(train) [2][2000/5758] lr: 9.9391e-04 eta: 14:10:28 time: 0.4057 data_time: 0.0016 memory: 20334 grad_norm: 0.1243 loss: 0.6885
2023/05/31 19:35:50 - mmengine - INFO - Epoch(train) [2][2100/5758] lr: 9.9391e-04 eta: 14:09:09 time: 0.4477 data_time: 0.0015 memory: 20334 grad_norm: 0.0797 loss: 0.6879
2023/05/31 19:36:32 - mmengine - INFO - Epoch(train) [2][2200/5758] lr: 9.9391e-04 eta: 14:07:08 time: 0.5015 data_time: 0.0018 memory: 20334 grad_norm: 0.1389 loss: 0.6910
2023/05/31 19:36:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 19:37:15 - mmengine - INFO - Epoch(train) [2][2300/5758] lr: 9.9391e-04 eta: 14:05:21 time: 0.3787 data_time: 0.0016 memory: 20334 grad_norm: 0.0999 loss: 0.6889
2023/05/31 19:37:58 - mmengine - INFO - Epoch(train) [2][2400/5758] lr: 9.9391e-04 eta: 14:03:31 time: 0.4515 data_time: 0.0015 memory: 20334 grad_norm: 0.0948 loss: 0.6863
2023/05/31 19:38:40 - mmengine - INFO - Epoch(train) [2][2500/5758] lr: 9.9391e-04 eta: 14:01:44 time: 0.4425 data_time: 0.0016 memory: 20334 grad_norm: 0.1123 loss: 0.6916
2023/05/31 19:39:22 - mmengine - INFO - Epoch(train) [2][2600/5758] lr: 9.9391e-04 eta: 13:59:44 time: 0.4332 data_time: 0.0016 memory: 20334 grad_norm: 0.1154 loss: 0.6823
2023/05/31 19:40:04 - mmengine - INFO - Epoch(train) [2][2700/5758] lr: 9.9391e-04 eta: 13:57:53 time: 0.4113 data_time: 0.0015 memory: 20334 grad_norm: 0.1471 loss: 0.6900
2023/05/31 19:40:46 - mmengine - INFO - Epoch(train) [2][2800/5758] lr: 9.9391e-04 eta: 13:56:04 time: 0.3838 data_time: 0.0015 memory: 20334 grad_norm: 0.1072 loss: 0.6860
2023/05/31 19:41:29 - mmengine - INFO - Epoch(train) [2][2900/5758] lr: 9.9391e-04 eta: 13:54:28 time: 0.4390
data_time: 0.0016 memory: 20334 grad_norm: 0.1075 loss: 0.6898 2023/05/31 19:42:10 - mmengine - INFO - Epoch(train) [2][3000/5758] lr: 9.9391e-04 eta: 13:52:26 time: 0.3877 data_time: 0.0016 memory: 20334 grad_norm: 0.1138 loss: 0.6888 2023/05/31 19:42:52 - mmengine - INFO - Epoch(train) [2][3100/5758] lr: 9.9391e-04 eta: 13:50:39 time: 0.3935 data_time: 0.0017 memory: 20334 grad_norm: 0.0867 loss: 0.6898 2023/05/31 19:43:34 - mmengine - INFO - Epoch(train) [2][3200/5758] lr: 9.9391e-04 eta: 13:48:50 time: 0.4575 data_time: 0.0016 memory: 20334 grad_norm: 0.1330 loss: 0.6889 2023/05/31 19:43:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 19:44:17 - mmengine - INFO - Epoch(train) [2][3300/5758] lr: 9.9391e-04 eta: 13:47:20 time: 0.4360 data_time: 0.0018 memory: 20334 grad_norm: 0.0948 loss: 0.6842 2023/05/31 19:44:59 - mmengine - INFO - Epoch(train) [2][3400/5758] lr: 9.9391e-04 eta: 13:45:46 time: 0.4034 data_time: 0.0017 memory: 20334 grad_norm: 0.1201 loss: 0.6888 2023/05/31 19:45:39 - mmengine - INFO - Epoch(train) [2][3500/5758] lr: 9.9391e-04 eta: 13:43:43 time: 0.3864 data_time: 0.0017 memory: 20334 grad_norm: 0.1380 loss: 0.6826 2023/05/31 19:46:19 - mmengine - INFO - Epoch(train) [2][3600/5758] lr: 9.9391e-04 eta: 13:41:41 time: 0.4217 data_time: 0.0016 memory: 20334 grad_norm: 0.0715 loss: 0.6875 2023/05/31 19:46:58 - mmengine - INFO - Epoch(train) [2][3700/5758] lr: 9.9391e-04 eta: 13:39:27 time: 0.4025 data_time: 0.0016 memory: 20334 grad_norm: 0.0821 loss: 0.6869 2023/05/31 19:47:38 - mmengine - INFO - Epoch(train) [2][3800/5758] lr: 9.9391e-04 eta: 13:37:21 time: 0.3974 data_time: 0.0014 memory: 20334 grad_norm: 0.0815 loss: 0.6842 2023/05/31 19:48:17 - mmengine - INFO - Epoch(train) [2][3900/5758] lr: 9.9391e-04 eta: 13:35:14 time: 0.4269 data_time: 0.0016 memory: 20334 grad_norm: 0.0614 loss: 0.6825 2023/05/31 19:48:58 - mmengine - INFO - Epoch(train) [2][4000/5758] lr: 9.9391e-04 eta: 13:33:29 time: 0.4035 
data_time: 0.0018 memory: 20334 grad_norm: 0.0564 loss: 0.6930 2023/05/31 19:49:36 - mmengine - INFO - Epoch(train) [2][4100/5758] lr: 9.9391e-04 eta: 13:31:19 time: 0.4068 data_time: 0.0016 memory: 20334 grad_norm: 0.0691 loss: 0.6821 2023/05/31 19:50:15 - mmengine - INFO - Epoch(train) [2][4200/5758] lr: 9.9391e-04 eta: 13:29:18 time: 0.4096 data_time: 0.0016 memory: 20334 grad_norm: 0.0648 loss: 0.6846 2023/05/31 19:50:32 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 19:50:55 - mmengine - INFO - Epoch(train) [2][4300/5758] lr: 9.9391e-04 eta: 13:27:23 time: 0.4189 data_time: 0.0018 memory: 20334 grad_norm: 0.0824 loss: 0.6868 2023/05/31 19:51:35 - mmengine - INFO - Epoch(train) [2][4400/5758] lr: 9.9391e-04 eta: 13:25:34 time: 0.4179 data_time: 0.0019 memory: 20334 grad_norm: 0.0776 loss: 0.6857 2023/05/31 19:52:15 - mmengine - INFO - Epoch(train) [2][4500/5758] lr: 9.9391e-04 eta: 13:23:51 time: 0.4282 data_time: 0.0015 memory: 20334 grad_norm: 0.0703 loss: 0.6881 2023/05/31 19:52:56 - mmengine - INFO - Epoch(train) [2][4600/5758] lr: 9.9391e-04 eta: 13:22:08 time: 0.3999 data_time: 0.0016 memory: 20334 grad_norm: 0.0567 loss: 0.6862 2023/05/31 19:53:35 - mmengine - INFO - Epoch(train) [2][4700/5758] lr: 9.9391e-04 eta: 13:20:16 time: 0.3663 data_time: 0.0019 memory: 20334 grad_norm: 0.0345 loss: 0.6859 2023/05/31 19:54:14 - mmengine - INFO - Epoch(train) [2][4800/5758] lr: 9.9391e-04 eta: 13:18:27 time: 0.3887 data_time: 0.0015 memory: 20334 grad_norm: 0.0623 loss: 0.6874 2023/05/31 19:54:56 - mmengine - INFO - Epoch(train) [2][4900/5758] lr: 9.9391e-04 eta: 13:17:01 time: 0.4403 data_time: 0.0027 memory: 20334 grad_norm: 0.0594 loss: 0.6838 2023/05/31 19:55:36 - mmengine - INFO - Epoch(train) [2][5000/5758] lr: 9.9391e-04 eta: 13:15:20 time: 0.3890 data_time: 0.0016 memory: 20334 grad_norm: 0.0604 loss: 0.6927 2023/05/31 19:56:18 - mmengine - INFO - Epoch(train) [2][5100/5758] lr: 9.9391e-04 eta: 13:13:55 time: 0.4493 
data_time: 0.0017 memory: 20334 grad_norm: 0.0538 loss: 0.6867 2023/05/31 19:56:59 - mmengine - INFO - Epoch(train) [2][5200/5758] lr: 9.9391e-04 eta: 13:12:23 time: 0.4407 data_time: 0.0017 memory: 20334 grad_norm: 0.0643 loss: 0.6885 2023/05/31 19:57:16 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 19:57:39 - mmengine - INFO - Epoch(train) [2][5300/5758] lr: 9.9391e-04 eta: 13:10:52 time: 0.3635 data_time: 0.0022 memory: 20334 grad_norm: 0.0312 loss: 0.6879 2023/05/31 19:58:21 - mmengine - INFO - Epoch(train) [2][5400/5758] lr: 9.9391e-04 eta: 13:09:27 time: 0.4347 data_time: 0.0023 memory: 20334 grad_norm: 0.0441 loss: 0.6896 2023/05/31 19:59:03 - mmengine - INFO - Epoch(train) [2][5500/5758] lr: 9.9391e-04 eta: 13:08:13 time: 0.4383 data_time: 0.0019 memory: 20334 grad_norm: 0.0362 loss: 0.6920 2023/05/31 19:59:45 - mmengine - INFO - Epoch(train) [2][5600/5758] lr: 9.9391e-04 eta: 13:06:57 time: 0.3910 data_time: 0.0016 memory: 20334 grad_norm: 0.0308 loss: 0.6834 2023/05/31 20:00:26 - mmengine - INFO - Epoch(train) [2][5700/5758] lr: 9.9391e-04 eta: 13:05:30 time: 0.3994 data_time: 0.0016 memory: 20334 grad_norm: 0.0362 loss: 0.6887 2023/05/31 20:00:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:00:51 - mmengine - INFO - Saving checkpoint at 2 epochs 2023/05/31 20:01:08 - mmengine - INFO - Epoch(val) [2][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3736 time: 1.0013 2023/05/31 20:01:54 - mmengine - INFO - Epoch(train) [3][ 100/5758] lr: 9.7577e-04 eta: 13:04:09 time: 0.4018 data_time: 0.0015 memory: 20334 grad_norm: 0.0310 loss: 0.6865 2023/05/31 20:02:36 - mmengine - INFO - Epoch(train) [3][ 200/5758] lr: 9.7577e-04 eta: 13:02:53 time: 0.3977 data_time: 0.0015 memory: 20334 grad_norm: 0.0339 loss: 0.6903 2023/05/31 20:03:18 - mmengine - INFO - 
Epoch(train) [3][ 300/5758] lr: 9.7577e-04 eta: 13:01:41 time: 0.3921 data_time: 0.0015 memory: 20334 grad_norm: 0.0301 loss: 0.6797 2023/05/31 20:03:59 - mmengine - INFO - Epoch(train) [3][ 400/5758] lr: 9.7577e-04 eta: 13:00:18 time: 0.4322 data_time: 0.0016 memory: 20334 grad_norm: 0.0222 loss: 0.6907 2023/05/31 20:04:34 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:04:40 - mmengine - INFO - Epoch(train) [3][ 500/5758] lr: 9.7577e-04 eta: 12:58:54 time: 0.3707 data_time: 0.0014 memory: 20334 grad_norm: 0.0252 loss: 0.6878 2023/05/31 20:05:21 - mmengine - INFO - Epoch(train) [3][ 600/5758] lr: 9.7577e-04 eta: 12:57:29 time: 0.4078 data_time: 0.0014 memory: 20334 grad_norm: 0.0266 loss: 0.6892 2023/05/31 20:06:03 - mmengine - INFO - Epoch(train) [3][ 700/5758] lr: 9.7577e-04 eta: 12:56:16 time: 0.4164 data_time: 0.0014 memory: 20334 grad_norm: 0.0289 loss: 0.6871 2023/05/31 20:06:46 - mmengine - INFO - Epoch(train) [3][ 800/5758] lr: 9.7577e-04 eta: 12:55:19 time: 0.4127 data_time: 0.0015 memory: 20334 grad_norm: 0.0276 loss: 0.6887 2023/05/31 20:07:31 - mmengine - INFO - Epoch(train) [3][ 900/5758] lr: 9.7577e-04 eta: 12:54:27 time: 0.4864 data_time: 0.0014 memory: 20334 grad_norm: 0.0165 loss: 0.6819 2023/05/31 20:08:17 - mmengine - INFO - Epoch(train) [3][1000/5758] lr: 9.7577e-04 eta: 12:53:46 time: 0.4091 data_time: 0.0016 memory: 20334 grad_norm: 0.0219 loss: 0.6840 2023/05/31 20:09:03 - mmengine - INFO - Epoch(train) [3][1100/5758] lr: 9.7577e-04 eta: 12:53:09 time: 0.4353 data_time: 0.0014 memory: 20334 grad_norm: 0.0235 loss: 0.6887 2023/05/31 20:09:47 - mmengine - INFO - Epoch(train) [3][1200/5758] lr: 9.7577e-04 eta: 12:52:14 time: 0.4666 data_time: 0.0014 memory: 20334 grad_norm: 0.0147 loss: 0.6851 2023/05/31 20:10:34 - mmengine - INFO - Epoch(train) [3][1300/5758] lr: 9.7577e-04 eta: 12:51:40 time: 0.4219 data_time: 0.0014 memory: 20334 grad_norm: 0.0252 loss: 0.6859 2023/05/31 20:11:17 - mmengine - INFO - 
Epoch(train) [3][1400/5758] lr: 9.7577e-04 eta: 12:50:37 time: 0.4625 data_time: 0.0015 memory: 20334 grad_norm: 0.0157 loss: 0.6885 2023/05/31 20:11:54 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:12:01 - mmengine - INFO - Epoch(train) [3][1500/5758] lr: 9.7577e-04 eta: 12:49:44 time: 0.4381 data_time: 0.0014 memory: 20334 grad_norm: 0.0259 loss: 0.6954 2023/05/31 20:12:45 - mmengine - INFO - Epoch(train) [3][1600/5758] lr: 9.7577e-04 eta: 12:48:48 time: 0.4287 data_time: 0.0023 memory: 20334 grad_norm: 0.0237 loss: 0.6878 2023/05/31 20:13:30 - mmengine - INFO - Epoch(train) [3][1700/5758] lr: 9.7577e-04 eta: 12:48:03 time: 0.4157 data_time: 0.0018 memory: 20334 grad_norm: 0.0196 loss: 0.6908 2023/05/31 20:14:14 - mmengine - INFO - Epoch(train) [3][1800/5758] lr: 9.7577e-04 eta: 12:47:07 time: 0.4315 data_time: 0.0019 memory: 20334 grad_norm: 0.0223 loss: 0.6813 2023/05/31 20:14:59 - mmengine - INFO - Epoch(train) [3][1900/5758] lr: 9.7577e-04 eta: 12:46:20 time: 0.4805 data_time: 0.0016 memory: 20334 grad_norm: 0.0273 loss: 0.6873 2023/05/31 20:15:43 - mmengine - INFO - Epoch(train) [3][2000/5758] lr: 9.7577e-04 eta: 12:45:30 time: 0.4310 data_time: 0.0015 memory: 20334 grad_norm: 0.0265 loss: 0.6899 2023/05/31 20:16:26 - mmengine - INFO - Epoch(train) [3][2100/5758] lr: 9.7577e-04 eta: 12:44:28 time: 0.4255 data_time: 0.0015 memory: 20334 grad_norm: 0.0124 loss: 0.6818 2023/05/31 20:17:10 - mmengine - INFO - Epoch(train) [3][2200/5758] lr: 9.7577e-04 eta: 12:43:30 time: 0.4454 data_time: 0.0020 memory: 20334 grad_norm: 0.0120 loss: 0.6839 2023/05/31 20:17:54 - mmengine - INFO - Epoch(train) [3][2300/5758] lr: 9.7577e-04 eta: 12:42:41 time: 0.4625 data_time: 0.0014 memory: 20334 grad_norm: 0.0236 loss: 0.6908 2023/05/31 20:18:37 - mmengine - INFO - Epoch(train) [3][2400/5758] lr: 9.7577e-04 eta: 12:41:41 time: 0.4348 data_time: 0.0016 memory: 20334 grad_norm: 0.0181 loss: 0.6885 2023/05/31 20:19:16 - mmengine - INFO - Exp 
name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:19:22 - mmengine - INFO - Epoch(train) [3][2500/5758] lr: 9.7577e-04 eta: 12:40:54 time: 0.4458 data_time: 0.0022 memory: 20334 grad_norm: 0.0273 loss: 0.6876 2023/05/31 20:20:06 - mmengine - INFO - Epoch(train) [3][2600/5758] lr: 9.7577e-04 eta: 12:40:01 time: 0.3965 data_time: 0.0015 memory: 20334 grad_norm: 0.0133 loss: 0.6822 2023/05/31 20:20:50 - mmengine - INFO - Epoch(train) [3][2700/5758] lr: 9.7577e-04 eta: 12:39:03 time: 0.4620 data_time: 0.0013 memory: 20334 grad_norm: 0.0275 loss: 0.6798 2023/05/31 20:21:36 - mmengine - INFO - Epoch(train) [3][2800/5758] lr: 9.7577e-04 eta: 12:38:26 time: 0.4705 data_time: 0.0019 memory: 20334 grad_norm: 0.0217 loss: 0.6842 2023/05/31 20:22:20 - mmengine - INFO - Epoch(train) [3][2900/5758] lr: 9.7577e-04 eta: 12:37:34 time: 0.4299 data_time: 0.0017 memory: 20334 grad_norm: 0.0169 loss: 0.6871 2023/05/31 20:23:03 - mmengine - INFO - Epoch(train) [3][3000/5758] lr: 9.7577e-04 eta: 12:36:34 time: 0.4040 data_time: 0.0015 memory: 20334 grad_norm: 0.0239 loss: 0.6868 2023/05/31 20:23:46 - mmengine - INFO - Epoch(train) [3][3100/5758] lr: 9.7577e-04 eta: 12:35:34 time: 0.3936 data_time: 0.0017 memory: 20334 grad_norm: 0.0167 loss: 0.6868 2023/05/31 20:24:31 - mmengine - INFO - Epoch(train) [3][3200/5758] lr: 9.7577e-04 eta: 12:34:47 time: 0.4512 data_time: 0.0015 memory: 20334 grad_norm: 0.0154 loss: 0.6861 2023/05/31 20:25:15 - mmengine - INFO - Epoch(train) [3][3300/5758] lr: 9.7577e-04 eta: 12:33:56 time: 0.4681 data_time: 0.0016 memory: 20334 grad_norm: 0.0145 loss: 0.6834 2023/05/31 20:26:01 - mmengine - INFO - Epoch(train) [3][3400/5758] lr: 9.7577e-04 eta: 12:33:19 time: 0.4666 data_time: 0.0016 memory: 20334 grad_norm: 0.0221 loss: 0.6830 2023/05/31 20:26:38 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:26:44 - mmengine - INFO - Epoch(train) [3][3500/5758] lr: 9.7577e-04 eta: 12:32:20 time: 0.4276 data_time: 0.0025 
memory: 20334 grad_norm: 0.0118 loss: 0.6831 2023/05/31 20:27:27 - mmengine - INFO - Epoch(train) [3][3600/5758] lr: 9.7577e-04 eta: 12:31:23 time: 0.3959 data_time: 0.0022 memory: 20334 grad_norm: 0.0161 loss: 0.6844 2023/05/31 20:28:12 - mmengine - INFO - Epoch(train) [3][3700/5758] lr: 9.7577e-04 eta: 12:30:34 time: 0.3996 data_time: 0.0014 memory: 20334 grad_norm: 0.0217 loss: 0.6889 2023/05/31 20:28:57 - mmengine - INFO - Epoch(train) [3][3800/5758] lr: 9.7577e-04 eta: 12:29:49 time: 0.4285 data_time: 0.0013 memory: 20334 grad_norm: 0.0112 loss: 0.6836 2023/05/31 20:29:41 - mmengine - INFO - Epoch(train) [3][3900/5758] lr: 9.7577e-04 eta: 12:29:00 time: 0.4196 data_time: 0.0020 memory: 20334 grad_norm: 0.0170 loss: 0.6858 2023/05/31 20:30:26 - mmengine - INFO - Epoch(train) [3][4000/5758] lr: 9.7577e-04 eta: 12:28:09 time: 0.4298 data_time: 0.0015 memory: 20334 grad_norm: 0.0314 loss: 0.6842 2023/05/31 20:31:09 - mmengine - INFO - Epoch(train) [3][4100/5758] lr: 9.7577e-04 eta: 12:27:10 time: 0.4046 data_time: 0.0021 memory: 20334 grad_norm: 0.0170 loss: 0.6856 2023/05/31 20:31:53 - mmengine - INFO - Epoch(train) [3][4200/5758] lr: 9.7577e-04 eta: 12:26:19 time: 0.4321 data_time: 0.0015 memory: 20334 grad_norm: 0.0161 loss: 0.6863 2023/05/31 20:32:35 - mmengine - INFO - Epoch(train) [3][4300/5758] lr: 9.7577e-04 eta: 12:25:20 time: 0.4370 data_time: 0.0015 memory: 20334 grad_norm: 0.0134 loss: 0.6926 2023/05/31 20:33:21 - mmengine - INFO - Epoch(train) [3][4400/5758] lr: 9.7577e-04 eta: 12:24:38 time: 0.4092 data_time: 0.0459 memory: 20334 grad_norm: 0.0179 loss: 0.6789 2023/05/31 20:33:57 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:34:05 - mmengine - INFO - Epoch(train) [3][4500/5758] lr: 9.7577e-04 eta: 12:23:46 time: 0.4261 data_time: 0.0647 memory: 20334 grad_norm: 0.0180 loss: 0.6856 2023/05/31 20:34:49 - mmengine - INFO - Epoch(train) [3][4600/5758] lr: 9.7577e-04 eta: 12:22:55 time: 0.4407 data_time: 0.0775 
memory: 20334 grad_norm: 0.0139 loss: 0.6850 2023/05/31 20:35:33 - mmengine - INFO - Epoch(train) [3][4700/5758] lr: 9.7577e-04 eta: 12:22:03 time: 0.4256 data_time: 0.0435 memory: 20334 grad_norm: 0.0240 loss: 0.6887 2023/05/31 20:36:16 - mmengine - INFO - Epoch(train) [3][4800/5758] lr: 9.7577e-04 eta: 12:21:10 time: 0.4609 data_time: 0.0015 memory: 20334 grad_norm: 0.0186 loss: 0.6877 2023/05/31 20:37:01 - mmengine - INFO - Epoch(train) [3][4900/5758] lr: 9.7577e-04 eta: 12:20:20 time: 0.4639 data_time: 0.0017 memory: 20334 grad_norm: 0.0175 loss: 0.6846 2023/05/31 20:37:44 - mmengine - INFO - Epoch(train) [3][5000/5758] lr: 9.7577e-04 eta: 12:19:28 time: 0.4122 data_time: 0.0012 memory: 20334 grad_norm: 0.0167 loss: 0.6921 2023/05/31 20:38:27 - mmengine - INFO - Epoch(train) [3][5100/5758] lr: 9.7577e-04 eta: 12:18:29 time: 0.4096 data_time: 0.0013 memory: 20334 grad_norm: 0.0165 loss: 0.6891 2023/05/31 20:39:11 - mmengine - INFO - Epoch(train) [3][5200/5758] lr: 9.7577e-04 eta: 12:17:38 time: 0.4865 data_time: 0.0020 memory: 20334 grad_norm: 0.0147 loss: 0.6927 2023/05/31 20:39:53 - mmengine - INFO - Epoch(train) [3][5300/5758] lr: 9.7577e-04 eta: 12:16:38 time: 0.4180 data_time: 0.0017 memory: 20334 grad_norm: 0.0146 loss: 0.6871 2023/05/31 20:40:37 - mmengine - INFO - Epoch(train) [3][5400/5758] lr: 9.7577e-04 eta: 12:15:43 time: 0.4236 data_time: 0.0023 memory: 20334 grad_norm: 0.0146 loss: 0.6873 2023/05/31 20:41:15 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:41:22 - mmengine - INFO - Epoch(train) [3][5500/5758] lr: 9.7577e-04 eta: 12:14:59 time: 0.4132 data_time: 0.0020 memory: 20334 grad_norm: 0.0168 loss: 0.6844 2023/05/31 20:42:06 - mmengine - INFO - Epoch(train) [3][5600/5758] lr: 9.7577e-04 eta: 12:14:13 time: 0.4342 data_time: 0.0019 memory: 20334 grad_norm: 0.0145 loss: 0.6877 2023/05/31 20:42:49 - mmengine - INFO - Epoch(train) [3][5700/5758] lr: 9.7577e-04 eta: 12:13:17 time: 0.4276 data_time: 0.0016 
memory: 20334 grad_norm: 0.0230 loss: 0.6922 2023/05/31 20:43:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:43:14 - mmengine - INFO - Saving checkpoint at 3 epochs 2023/05/31 20:43:31 - mmengine - INFO - Epoch(val) [3][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3651 time: 0.9904 2023/05/31 20:44:19 - mmengine - INFO - Epoch(train) [4][ 100/5758] lr: 9.4605e-04 eta: 12:12:19 time: 0.3723 data_time: 0.0018 memory: 20334 grad_norm: 0.0164 loss: 0.6823 2023/05/31 20:45:04 - mmengine - INFO - Epoch(train) [4][ 200/5758] lr: 9.4605e-04 eta: 12:11:34 time: 0.4839 data_time: 0.0016 memory: 20334 grad_norm: 0.0149 loss: 0.6836 2023/05/31 20:45:46 - mmengine - INFO - Epoch(train) [4][ 300/5758] lr: 9.4605e-04 eta: 12:10:31 time: 0.4000 data_time: 0.0018 memory: 20334 grad_norm: 0.0210 loss: 0.6859 2023/05/31 20:46:30 - mmengine - INFO - Epoch(train) [4][ 400/5758] lr: 9.4605e-04 eta: 12:09:42 time: 0.4330 data_time: 0.0017 memory: 20334 grad_norm: 0.0166 loss: 0.6917 2023/05/31 20:47:13 - mmengine - INFO - Epoch(train) [4][ 500/5758] lr: 9.4605e-04 eta: 12:08:48 time: 0.4329 data_time: 0.0018 memory: 20334 grad_norm: 0.0118 loss: 0.6859 2023/05/31 20:47:58 - mmengine - INFO - Epoch(train) [4][ 600/5758] lr: 9.4605e-04 eta: 12:08:00 time: 0.4122 data_time: 0.0017 memory: 20334 grad_norm: 0.0178 loss: 0.6884 2023/05/31 20:48:41 - mmengine - INFO - Epoch(train) [4][ 700/5758] lr: 9.4605e-04 eta: 12:07:05 time: 0.4330 data_time: 0.0016 memory: 20334 grad_norm: 0.0183 loss: 0.6913 2023/05/31 20:48:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:49:24 - mmengine - INFO - Epoch(train) [4][ 800/5758] lr: 9.4605e-04 eta: 12:06:13 time: 0.4226 data_time: 0.0017 memory: 20334 grad_norm: 0.0258 loss: 0.6846 2023/05/31 20:50:07 - mmengine - INFO - Epoch(train) [4][ 900/5758] 
lr: 9.4605e-04 eta: 12:05:18 time: 0.4490 data_time: 0.0016 memory: 20334 grad_norm: 0.0205 loss: 0.6850 2023/05/31 20:50:51 - mmengine - INFO - Epoch(train) [4][1000/5758] lr: 9.4605e-04 eta: 12:04:25 time: 0.3926 data_time: 0.0018 memory: 20334 grad_norm: 0.0270 loss: 0.6846 2023/05/31 20:51:35 - mmengine - INFO - Epoch(train) [4][1100/5758] lr: 9.4605e-04 eta: 12:03:36 time: 0.4475 data_time: 0.0016 memory: 20334 grad_norm: 0.0234 loss: 0.6881 2023/05/31 20:52:18 - mmengine - INFO - Epoch(train) [4][1200/5758] lr: 9.4605e-04 eta: 12:02:40 time: 0.4378 data_time: 0.0019 memory: 20334 grad_norm: 0.0179 loss: 0.6856 2023/05/31 20:53:01 - mmengine - INFO - Epoch(train) [4][1300/5758] lr: 9.4605e-04 eta: 12:01:46 time: 0.4295 data_time: 0.0015 memory: 20334 grad_norm: 0.0126 loss: 0.6849 2023/05/31 20:53:45 - mmengine - INFO - Epoch(train) [4][1400/5758] lr: 9.4605e-04 eta: 12:00:57 time: 0.4935 data_time: 0.0018 memory: 20334 grad_norm: 0.0144 loss: 0.6878 2023/05/31 20:54:27 - mmengine - INFO - Epoch(train) [4][1500/5758] lr: 9.4605e-04 eta: 12:00:00 time: 0.4450 data_time: 0.0019 memory: 20334 grad_norm: 0.0175 loss: 0.6796 2023/05/31 20:55:11 - mmengine - INFO - Epoch(train) [4][1600/5758] lr: 9.4605e-04 eta: 11:59:09 time: 0.3642 data_time: 0.0019 memory: 20334 grad_norm: 0.0237 loss: 0.6832 2023/05/31 20:55:54 - mmengine - INFO - Epoch(train) [4][1700/5758] lr: 9.4605e-04 eta: 11:58:16 time: 0.4355 data_time: 0.0015 memory: 20334 grad_norm: 0.0191 loss: 0.6884 2023/05/31 20:56:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 20:56:37 - mmengine - INFO - Epoch(train) [4][1800/5758] lr: 9.4605e-04 eta: 11:57:21 time: 0.3870 data_time: 0.0015 memory: 20334 grad_norm: 0.0135 loss: 0.6882 2023/05/31 20:57:19 - mmengine - INFO - Epoch(train) [4][1900/5758] lr: 9.4605e-04 eta: 11:56:26 time: 0.3704 data_time: 0.0014 memory: 20334 grad_norm: 0.0217 loss: 0.6863 2023/05/31 20:58:02 - mmengine - INFO - Epoch(train) [4][2000/5758] lr: 
9.4605e-04 eta: 11:55:33 time: 0.4416 data_time: 0.0015 memory: 20334 grad_norm: 0.0081 loss: 0.6860 2023/05/31 20:58:45 - mmengine - INFO - Epoch(train) [4][2100/5758] lr: 9.4605e-04 eta: 11:54:36 time: 0.4142 data_time: 0.0017 memory: 20334 grad_norm: 0.0168 loss: 0.6825 2023/05/31 20:59:28 - mmengine - INFO - Epoch(train) [4][2200/5758] lr: 9.4605e-04 eta: 11:53:45 time: 0.4347 data_time: 0.0018 memory: 20334 grad_norm: 0.0182 loss: 0.6867 2023/05/31 21:00:13 - mmengine - INFO - Epoch(train) [4][2300/5758] lr: 9.4605e-04 eta: 11:52:59 time: 0.4444 data_time: 0.0015 memory: 20334 grad_norm: 0.0137 loss: 0.6875 2023/05/31 21:00:55 - mmengine - INFO - Epoch(train) [4][2400/5758] lr: 9.4605e-04 eta: 11:52:01 time: 0.4392 data_time: 0.0014 memory: 20334 grad_norm: 0.0134 loss: 0.6847 2023/05/31 21:01:37 - mmengine - INFO - Epoch(train) [4][2500/5758] lr: 9.4605e-04 eta: 11:51:04 time: 0.4118 data_time: 0.0015 memory: 20334 grad_norm: 0.0210 loss: 0.6840 2023/05/31 21:02:21 - mmengine - INFO - Epoch(train) [4][2600/5758] lr: 9.4605e-04 eta: 11:50:16 time: 0.4177 data_time: 0.0023 memory: 20334 grad_norm: 0.0131 loss: 0.6845 2023/05/31 21:03:02 - mmengine - INFO - Epoch(train) [4][2700/5758] lr: 9.4605e-04 eta: 11:49:13 time: 0.4065 data_time: 0.0017 memory: 20334 grad_norm: 0.0162 loss: 0.6870 2023/05/31 21:03:13 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:03:45 - mmengine - INFO - Epoch(train) [4][2800/5758] lr: 9.4605e-04 eta: 11:48:19 time: 0.4555 data_time: 0.0017 memory: 20334 grad_norm: 0.0206 loss: 0.6928 2023/05/31 21:04:28 - mmengine - INFO - Epoch(train) [4][2900/5758] lr: 9.4605e-04 eta: 11:47:27 time: 0.4271 data_time: 0.0016 memory: 20334 grad_norm: 0.0206 loss: 0.6828 2023/05/31 21:05:11 - mmengine - INFO - Epoch(train) [4][3000/5758] lr: 9.4605e-04 eta: 11:46:35 time: 0.4085 data_time: 0.0021 memory: 20334 grad_norm: 0.0146 loss: 0.6862 2023/05/31 21:05:54 - mmengine - INFO - Epoch(train) [4][3100/5758] lr: 
9.4605e-04 eta: 11:45:41 time: 0.4297 data_time: 0.0016 memory: 20334 grad_norm: 0.0215 loss: 0.6816 2023/05/31 21:06:36 - mmengine - INFO - Epoch(train) [4][3200/5758] lr: 9.4605e-04 eta: 11:44:44 time: 0.4052 data_time: 0.0016 memory: 20334 grad_norm: 0.0207 loss: 0.6868 2023/05/31 21:07:20 - mmengine - INFO - Epoch(train) [4][3300/5758] lr: 9.4605e-04 eta: 11:44:00 time: 0.4284 data_time: 0.0016 memory: 20334 grad_norm: 0.0207 loss: 0.6845 2023/05/31 21:08:05 - mmengine - INFO - Epoch(train) [4][3400/5758] lr: 9.4605e-04 eta: 11:43:13 time: 0.5038 data_time: 0.0015 memory: 20334 grad_norm: 0.0181 loss: 0.6863 2023/05/31 21:08:48 - mmengine - INFO - Epoch(train) [4][3500/5758] lr: 9.4605e-04 eta: 11:42:20 time: 0.4160 data_time: 0.0016 memory: 20334 grad_norm: 0.0179 loss: 0.6884 2023/05/31 21:09:29 - mmengine - INFO - Epoch(train) [4][3600/5758] lr: 9.4605e-04 eta: 11:41:22 time: 0.4228 data_time: 0.0023 memory: 20334 grad_norm: 0.0198 loss: 0.6881 2023/05/31 21:10:11 - mmengine - INFO - Epoch(train) [4][3700/5758] lr: 9.4605e-04 eta: 11:40:26 time: 0.4068 data_time: 0.0012 memory: 20334 grad_norm: 0.0236 loss: 0.6876 2023/05/31 21:10:23 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:10:55 - mmengine - INFO - Epoch(train) [4][3800/5758] lr: 9.4605e-04 eta: 11:39:37 time: 0.4778 data_time: 0.0018 memory: 20334 grad_norm: 0.0132 loss: 0.6891 2023/05/31 21:11:39 - mmengine - INFO - Epoch(train) [4][3900/5758] lr: 9.4605e-04 eta: 11:38:50 time: 0.4253 data_time: 0.0021 memory: 20334 grad_norm: 0.0147 loss: 0.6857 2023/05/31 21:12:23 - mmengine - INFO - Epoch(train) [4][4000/5758] lr: 9.4605e-04 eta: 11:38:04 time: 0.4027 data_time: 0.0020 memory: 20334 grad_norm: 0.0128 loss: 0.6858 2023/05/31 21:13:06 - mmengine - INFO - Epoch(train) [4][4100/5758] lr: 9.4605e-04 eta: 11:37:09 time: 0.4351 data_time: 0.0017 memory: 20334 grad_norm: 0.0162 loss: 0.6872 2023/05/31 21:13:48 - mmengine - INFO - Epoch(train) [4][4200/5758] lr: 
9.4605e-04 eta: 11:36:14 time: 0.3973 data_time: 0.0018 memory: 20334 grad_norm: 0.0114 loss: 0.6922 2023/05/31 21:14:30 - mmengine - INFO - Epoch(train) [4][4300/5758] lr: 9.4605e-04 eta: 11:35:18 time: 0.4285 data_time: 0.0018 memory: 20334 grad_norm: 0.0213 loss: 0.6886 2023/05/31 21:15:12 - mmengine - INFO - Epoch(train) [4][4400/5758] lr: 9.4605e-04 eta: 11:34:25 time: 0.4498 data_time: 0.0017 memory: 20334 grad_norm: 0.0167 loss: 0.6875 2023/05/31 21:15:55 - mmengine - INFO - Epoch(train) [4][4500/5758] lr: 9.4605e-04 eta: 11:33:30 time: 0.4027 data_time: 0.0018 memory: 20334 grad_norm: 0.0458 loss: 0.6877 2023/05/31 21:16:38 - mmengine - INFO - Epoch(train) [4][4600/5758] lr: 9.4605e-04 eta: 11:32:41 time: 0.4618 data_time: 0.0017 memory: 20334 grad_norm: 0.0162 loss: 0.6881 2023/05/31 21:17:19 - mmengine - INFO - Epoch(train) [4][4700/5758] lr: 9.4605e-04 eta: 11:31:43 time: 0.4060 data_time: 0.0017 memory: 20334 grad_norm: 0.0261 loss: 0.6868 2023/05/31 21:17:31 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:18:03 - mmengine - INFO - Epoch(train) [4][4800/5758] lr: 9.4605e-04 eta: 11:30:52 time: 0.4904 data_time: 0.0020 memory: 20334 grad_norm: 0.0265 loss: 0.6937 2023/05/31 21:18:44 - mmengine - INFO - Epoch(train) [4][4900/5758] lr: 9.4605e-04 eta: 11:29:53 time: 0.3855 data_time: 0.0020 memory: 20334 grad_norm: 0.0134 loss: 0.6906 2023/05/31 21:19:27 - mmengine - INFO - Epoch(train) [4][5000/5758] lr: 9.4605e-04 eta: 11:29:05 time: 0.4084 data_time: 0.0027 memory: 20334 grad_norm: 0.0191 loss: 0.6885 2023/05/31 21:20:11 - mmengine - INFO - Epoch(train) [4][5100/5758] lr: 9.4605e-04 eta: 11:28:15 time: 0.4244 data_time: 0.0017 memory: 20334 grad_norm: 0.0107 loss: 0.6882 2023/05/31 21:20:54 - mmengine - INFO - Epoch(train) [4][5200/5758] lr: 9.4605e-04 eta: 11:27:28 time: 0.4063 data_time: 0.0021 memory: 20334 grad_norm: 0.0165 loss: 0.6871 2023/05/31 21:21:38 - mmengine - INFO - Epoch(train) [4][5300/5758] lr: 
9.4605e-04 eta: 11:26:38 time: 0.4249 data_time: 0.0021 memory: 20334 grad_norm: 0.0172 loss: 0.6871 2023/05/31 21:22:20 - mmengine - INFO - Epoch(train) [4][5400/5758] lr: 9.4605e-04 eta: 11:25:46 time: 0.4152 data_time: 0.0017 memory: 20334 grad_norm: 0.0170 loss: 0.6895 2023/05/31 21:23:03 - mmengine - INFO - Epoch(train) [4][5500/5758] lr: 9.4605e-04 eta: 11:24:54 time: 0.4291 data_time: 0.0018 memory: 20334 grad_norm: 0.0172 loss: 0.6827 2023/05/31 21:23:45 - mmengine - INFO - Epoch(train) [4][5600/5758] lr: 9.4605e-04 eta: 11:24:02 time: 0.4585 data_time: 0.0016 memory: 20334 grad_norm: 0.0167 loss: 0.6921 2023/05/31 21:24:29 - mmengine - INFO - Epoch(train) [4][5700/5758] lr: 9.4605e-04 eta: 11:23:15 time: 0.4407 data_time: 0.0019 memory: 20334 grad_norm: 0.0169 loss: 0.6857 2023/05/31 21:24:40 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:24:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:24:53 - mmengine - INFO - Saving checkpoint at 4 epochs 2023/05/31 21:25:11 - mmengine - INFO - Epoch(val) [4][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3748 time: 1.0005 2023/05/31 21:26:00 - mmengine - INFO - Epoch(train) [5][ 100/5758] lr: 9.0546e-04 eta: 11:22:13 time: 0.4626 data_time: 0.0017 memory: 20334 grad_norm: 0.0135 loss: 0.6892 2023/05/31 21:26:46 - mmengine - INFO - Epoch(train) [5][ 200/5758] lr: 9.0546e-04 eta: 11:21:33 time: 0.5576 data_time: 0.0018 memory: 20334 grad_norm: 0.0138 loss: 0.6833 2023/05/31 21:27:29 - mmengine - INFO - Epoch(train) [5][ 300/5758] lr: 9.0546e-04 eta: 11:20:46 time: 0.4598 data_time: 0.0018 memory: 20334 grad_norm: 0.0153 loss: 0.6862 2023/05/31 21:28:12 - mmengine - INFO - Epoch(train) [5][ 400/5758] lr: 9.0546e-04 eta: 11:19:56 time: 0.4185 data_time: 0.0015 memory: 20334 grad_norm: 0.0148 loss: 0.6823 2023/05/31 
21:28:56 - mmengine - INFO - Epoch(train) [5][ 500/5758] lr: 9.0546e-04 eta: 11:19:08 time: 0.4824 data_time: 0.0016 memory: 20334 grad_norm: 0.0160 loss: 0.6872 2023/05/31 21:29:39 - mmengine - INFO - Epoch(train) [5][ 600/5758] lr: 9.0546e-04 eta: 11:18:19 time: 0.4359 data_time: 0.0017 memory: 20334 grad_norm: 0.0169 loss: 0.6870 2023/05/31 21:30:24 - mmengine - INFO - Epoch(train) [5][ 700/5758] lr: 9.0546e-04 eta: 11:17:35 time: 0.4727 data_time: 0.0016 memory: 20334 grad_norm: 0.0149 loss: 0.6889 2023/05/31 21:31:08 - mmengine - INFO - Epoch(train) [5][ 800/5758] lr: 9.0546e-04 eta: 11:16:48 time: 0.3930 data_time: 0.0017 memory: 20334 grad_norm: 0.0179 loss: 0.6852 2023/05/31 21:31:51 - mmengine - INFO - Epoch(train) [5][ 900/5758] lr: 9.0546e-04 eta: 11:15:59 time: 0.4594 data_time: 0.0018 memory: 20334 grad_norm: 0.0157 loss: 0.6877 2023/05/31 21:32:21 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:32:35 - mmengine - INFO - Epoch(train) [5][1000/5758] lr: 9.0546e-04 eta: 11:15:14 time: 0.4944 data_time: 0.0013 memory: 20334 grad_norm: 0.0222 loss: 0.6885 2023/05/31 21:33:18 - mmengine - INFO - Epoch(train) [5][1100/5758] lr: 9.0546e-04 eta: 11:14:22 time: 0.4181 data_time: 0.0017 memory: 20334 grad_norm: 0.0180 loss: 0.6851 2023/05/31 21:34:03 - mmengine - INFO - Epoch(train) [5][1200/5758] lr: 9.0546e-04 eta: 11:13:41 time: 0.4728 data_time: 0.0016 memory: 20334 grad_norm: 0.0201 loss: 0.6893 2023/05/31 21:34:47 - mmengine - INFO - Epoch(train) [5][1300/5758] lr: 9.0546e-04 eta: 11:12:54 time: 0.4514 data_time: 0.0024 memory: 20334 grad_norm: 0.0127 loss: 0.6831 2023/05/31 21:35:31 - mmengine - INFO - Epoch(train) [5][1400/5758] lr: 9.0546e-04 eta: 11:12:08 time: 0.4225 data_time: 0.0016 memory: 20334 grad_norm: 0.0154 loss: 0.6874 2023/05/31 21:36:14 - mmengine - INFO - Epoch(train) [5][1500/5758] lr: 9.0546e-04 eta: 11:11:16 time: 0.4359 data_time: 0.0013 memory: 20334 grad_norm: 0.0270 loss: 0.6900 2023/05/31 
21:36:58 - mmengine - INFO - Epoch(train) [5][1600/5758] lr: 9.0546e-04 eta: 11:10:34 time: 0.4071 data_time: 0.0019 memory: 20334 grad_norm: 0.0152 loss: 0.6914 2023/05/31 21:37:39 - mmengine - INFO - Epoch(train) [5][1700/5758] lr: 9.0546e-04 eta: 11:09:34 time: 0.4208 data_time: 0.0020 memory: 20334 grad_norm: 0.0142 loss: 0.6832 2023/05/31 21:38:21 - mmengine - INFO - Epoch(train) [5][1800/5758] lr: 9.0546e-04 eta: 11:08:40 time: 0.4055 data_time: 0.0018 memory: 20334 grad_norm: 0.0170 loss: 0.6816 2023/05/31 21:39:04 - mmengine - INFO - Epoch(train) [5][1900/5758] lr: 9.0546e-04 eta: 11:07:54 time: 0.4407 data_time: 0.0018 memory: 20334 grad_norm: 0.0153 loss: 0.6831 2023/05/31 21:39:33 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:39:47 - mmengine - INFO - Epoch(train) [5][2000/5758] lr: 9.0546e-04 eta: 11:07:02 time: 0.4170 data_time: 0.0020 memory: 20334 grad_norm: 0.0161 loss: 0.6893 2023/05/31 21:40:29 - mmengine - INFO - Epoch(train) [5][2100/5758] lr: 9.0546e-04 eta: 11:06:11 time: 0.4147 data_time: 0.0024 memory: 20334 grad_norm: 0.0106 loss: 0.6865 2023/05/31 21:41:11 - mmengine - INFO - Epoch(train) [5][2200/5758] lr: 9.0546e-04 eta: 11:05:18 time: 0.3871 data_time: 0.0017 memory: 20334 grad_norm: 0.0140 loss: 0.6822 2023/05/31 21:41:56 - mmengine - INFO - Epoch(train) [5][2300/5758] lr: 9.0546e-04 eta: 11:04:34 time: 0.4810 data_time: 0.0025 memory: 20334 grad_norm: 0.0115 loss: 0.6844 2023/05/31 21:42:40 - mmengine - INFO - Epoch(train) [5][2400/5758] lr: 9.0546e-04 eta: 11:03:47 time: 0.4656 data_time: 0.0023 memory: 20334 grad_norm: 0.0259 loss: 0.6852 2023/05/31 21:43:24 - mmengine - INFO - Epoch(train) [5][2500/5758] lr: 9.0546e-04 eta: 11:03:02 time: 0.4012 data_time: 0.0022 memory: 20334 grad_norm: 0.0133 loss: 0.6867 2023/05/31 21:44:07 - mmengine - INFO - Epoch(train) [5][2600/5758] lr: 9.0546e-04 eta: 11:02:13 time: 0.3986 data_time: 0.0025 memory: 20334 grad_norm: 0.0169 loss: 0.6808 2023/05/31 
21:44:50 - mmengine - INFO - Epoch(train) [5][2700/5758] lr: 9.0546e-04 eta: 11:01:25 time: 0.4566 data_time: 0.0017 memory: 20334 grad_norm: 0.0106 loss: 0.6883 2023/05/31 21:45:32 - mmengine - INFO - Epoch(train) [5][2800/5758] lr: 9.0546e-04 eta: 11:00:30 time: 0.3927 data_time: 0.0016 memory: 20334 grad_norm: 0.0138 loss: 0.6906 2023/05/31 21:46:13 - mmengine - INFO - Epoch(train) [5][2900/5758] lr: 9.0546e-04 eta: 10:59:36 time: 0.4240 data_time: 0.0020 memory: 20334 grad_norm: 0.0143 loss: 0.6879 2023/05/31 21:46:43 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:46:56 - mmengine - INFO - Epoch(train) [5][3000/5758] lr: 9.0546e-04 eta: 10:58:48 time: 0.4904 data_time: 0.0023 memory: 20334 grad_norm: 0.0263 loss: 0.6910 2023/05/31 21:47:38 - mmengine - INFO - Epoch(train) [5][3100/5758] lr: 9.0546e-04 eta: 10:57:56 time: 0.4185 data_time: 0.0018 memory: 20334 grad_norm: 0.0194 loss: 0.6845 2023/05/31 21:48:21 - mmengine - INFO - Epoch(train) [5][3200/5758] lr: 9.0546e-04 eta: 10:57:05 time: 0.4449 data_time: 0.0017 memory: 20334 grad_norm: 0.0220 loss: 0.6875 2023/05/31 21:49:05 - mmengine - INFO - Epoch(train) [5][3300/5758] lr: 9.0546e-04 eta: 10:56:20 time: 0.3759 data_time: 0.0014 memory: 20334 grad_norm: 0.0263 loss: 0.6842 2023/05/31 21:49:49 - mmengine - INFO - Epoch(train) [5][3400/5758] lr: 9.0546e-04 eta: 10:55:33 time: 0.5078 data_time: 0.0015 memory: 20334 grad_norm: 0.0196 loss: 0.6860 2023/05/31 21:50:32 - mmengine - INFO - Epoch(train) [5][3500/5758] lr: 9.0546e-04 eta: 10:54:44 time: 0.5131 data_time: 0.0022 memory: 20334 grad_norm: 0.0172 loss: 0.6866 2023/05/31 21:51:15 - mmengine - INFO - Epoch(train) [5][3600/5758] lr: 9.0546e-04 eta: 10:53:59 time: 0.5038 data_time: 0.0016 memory: 20334 grad_norm: 0.0099 loss: 0.6881 2023/05/31 21:51:59 - mmengine - INFO - Epoch(train) [5][3700/5758] lr: 9.0546e-04 eta: 10:53:12 time: 0.4277 data_time: 0.0022 memory: 20334 grad_norm: 0.0164 loss: 0.6874 2023/05/31 
21:52:41 - mmengine - INFO - Epoch(train) [5][3800/5758] lr: 9.0546e-04 eta: 10:52:20 time: 0.4236 data_time: 0.0025 memory: 20334 grad_norm: 0.0197 loss: 0.6891 2023/05/31 21:53:26 - mmengine - INFO - Epoch(train) [5][3900/5758] lr: 9.0546e-04 eta: 10:51:37 time: 0.4664 data_time: 0.0020 memory: 20334 grad_norm: 0.0167 loss: 0.6835 2023/05/31 21:53:59 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 21:54:12 - mmengine - INFO - Epoch(train) [5][4000/5758] lr: 9.0546e-04 eta: 10:50:58 time: 0.3931 data_time: 0.0024 memory: 20334 grad_norm: 0.0188 loss: 0.6852 2023/05/31 21:54:53 - mmengine - INFO - Epoch(train) [5][4100/5758] lr: 9.0546e-04 eta: 10:50:04 time: 0.3859 data_time: 0.0015 memory: 20334 grad_norm: 0.0138 loss: 0.6881 2023/05/31 21:55:37 - mmengine - INFO - Epoch(train) [5][4200/5758] lr: 9.0546e-04 eta: 10:49:17 time: 0.4127 data_time: 0.0015 memory: 20334 grad_norm: 0.0129 loss: 0.6901 2023/05/31 21:56:22 - mmengine - INFO - Epoch(train) [5][4300/5758] lr: 9.0546e-04 eta: 10:48:37 time: 0.4177 data_time: 0.0017 memory: 20334 grad_norm: 0.0163 loss: 0.6870 2023/05/31 21:57:05 - mmengine - INFO - Epoch(train) [5][4400/5758] lr: 9.0546e-04 eta: 10:47:47 time: 0.4095 data_time: 0.0016 memory: 20334 grad_norm: 0.0224 loss: 0.6888 2023/05/31 21:57:47 - mmengine - INFO - Epoch(train) [5][4500/5758] lr: 9.0546e-04 eta: 10:46:57 time: 0.4457 data_time: 0.0017 memory: 20334 grad_norm: 0.0121 loss: 0.6857 2023/05/31 21:58:33 - mmengine - INFO - Epoch(train) [5][4600/5758] lr: 9.0546e-04 eta: 10:46:16 time: 0.5404 data_time: 0.0016 memory: 20334 grad_norm: 0.0148 loss: 0.6922 2023/05/31 21:59:16 - mmengine - INFO - Epoch(train) [5][4700/5758] lr: 9.0546e-04 eta: 10:45:27 time: 0.4003 data_time: 0.0026 memory: 20334 grad_norm: 0.0248 loss: 0.6810 2023/05/31 22:00:00 - mmengine - INFO - Epoch(train) [5][4800/5758] lr: 9.0546e-04 eta: 10:44:43 time: 0.4443 data_time: 0.0026 memory: 20334 grad_norm: 0.0229 loss: 0.6823 2023/05/31 
22:00:44 - mmengine - INFO - Epoch(train) [5][4900/5758] lr: 9.0546e-04 eta: 10:43:57 time: 0.4572 data_time: 0.0023 memory: 20334 grad_norm: 0.0174 loss: 0.6865 2023/05/31 22:01:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:01:29 - mmengine - INFO - Epoch(train) [5][5000/5758] lr: 9.0546e-04 eta: 10:43:15 time: 0.4226 data_time: 0.0025 memory: 20334 grad_norm: 0.0185 loss: 0.6900 2023/05/31 22:02:13 - mmengine - INFO - Epoch(train) [5][5100/5758] lr: 9.0546e-04 eta: 10:42:31 time: 0.4421 data_time: 0.0024 memory: 20334 grad_norm: 0.0099 loss: 0.6917 2023/05/31 22:02:59 - mmengine - INFO - Epoch(train) [5][5200/5758] lr: 9.0546e-04 eta: 10:41:52 time: 0.4703 data_time: 0.0024 memory: 20334 grad_norm: 0.0149 loss: 0.6890 2023/05/31 22:03:43 - mmengine - INFO - Epoch(train) [5][5300/5758] lr: 9.0546e-04 eta: 10:41:06 time: 0.4200 data_time: 0.0026 memory: 20334 grad_norm: 0.0187 loss: 0.6927 2023/05/31 22:04:27 - mmengine - INFO - Epoch(train) [5][5400/5758] lr: 9.0546e-04 eta: 10:40:21 time: 0.4258 data_time: 0.0025 memory: 20334 grad_norm: 0.0158 loss: 0.6880 2023/05/31 22:05:11 - mmengine - INFO - Epoch(train) [5][5500/5758] lr: 9.0546e-04 eta: 10:39:37 time: 0.4851 data_time: 0.0023 memory: 20334 grad_norm: 0.0193 loss: 0.6876 2023/05/31 22:05:56 - mmengine - INFO - Epoch(train) [5][5600/5758] lr: 9.0546e-04 eta: 10:38:54 time: 0.4599 data_time: 0.0018 memory: 20334 grad_norm: 0.0201 loss: 0.6833 2023/05/31 22:06:41 - mmengine - INFO - Epoch(train) [5][5700/5758] lr: 9.0546e-04 eta: 10:38:10 time: 0.4305 data_time: 0.0022 memory: 20334 grad_norm: 0.0176 loss: 0.6856 2023/05/31 22:07:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:07:05 - mmengine - INFO - Saving checkpoint at 5 epochs 2023/05/31 22:07:22 - mmengine - INFO - Epoch(val) [5][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] 
single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3672 time: 0.9941 2023/05/31 22:08:08 - mmengine - INFO - Epoch(train) [6][ 100/5758] lr: 8.5502e-04 eta: 10:37:02 time: 0.4215 data_time: 0.0017 memory: 20334 grad_norm: 0.0107 loss: 0.6819 2023/05/31 22:08:50 - mmengine - INFO - Epoch(train) [6][ 200/5758] lr: 8.5502e-04 eta: 10:36:11 time: 0.4055 data_time: 0.0017 memory: 20334 grad_norm: 0.0142 loss: 0.6858 2023/05/31 22:08:55 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:09:31 - mmengine - INFO - Epoch(train) [6][ 300/5758] lr: 8.5502e-04 eta: 10:35:16 time: 0.4040 data_time: 0.0022 memory: 20334 grad_norm: 0.0184 loss: 0.6853 2023/05/31 22:10:10 - mmengine - INFO - Epoch(train) [6][ 400/5758] lr: 8.5502e-04 eta: 10:34:17 time: 0.3814 data_time: 0.0017 memory: 20334 grad_norm: 0.0178 loss: 0.6879 2023/05/31 22:10:52 - mmengine - INFO - Epoch(train) [6][ 500/5758] lr: 8.5502e-04 eta: 10:33:25 time: 0.4531 data_time: 0.0023 memory: 20334 grad_norm: 0.0173 loss: 0.6870 2023/05/31 22:11:31 - mmengine - INFO - Epoch(train) [6][ 600/5758] lr: 8.5502e-04 eta: 10:32:27 time: 0.3860 data_time: 0.0020 memory: 20334 grad_norm: 0.0177 loss: 0.6863 2023/05/31 22:12:13 - mmengine - INFO - Epoch(train) [6][ 700/5758] lr: 8.5502e-04 eta: 10:31:36 time: 0.4217 data_time: 0.0020 memory: 20334 grad_norm: 0.0179 loss: 0.6891 2023/05/31 22:12:53 - mmengine - INFO - Epoch(train) [6][ 800/5758] lr: 8.5502e-04 eta: 10:30:40 time: 0.3811 data_time: 0.0017 memory: 20334 grad_norm: 0.0186 loss: 0.6859 2023/05/31 22:13:35 - mmengine - INFO - Epoch(train) [6][ 900/5758] lr: 8.5502e-04 eta: 10:29:47 time: 0.4309 data_time: 0.0031 memory: 20334 grad_norm: 0.0150 loss: 0.6860 2023/05/31 22:14:16 - mmengine - INFO - Epoch(train) [6][1000/5758] lr: 8.5502e-04 eta: 10:28:54 time: 0.4587 data_time: 0.0024 memory: 20334 grad_norm: 0.0151 loss: 0.6870 2023/05/31 22:14:58 - mmengine - INFO - Epoch(train) [6][1100/5758] lr: 8.5502e-04 eta: 10:28:04 time: 
0.4721 data_time: 0.0026 memory: 20334 grad_norm: 0.0219 loss: 0.6868 2023/05/31 22:15:39 - mmengine - INFO - Epoch(train) [6][1200/5758] lr: 8.5502e-04 eta: 10:27:10 time: 0.4290 data_time: 0.0030 memory: 20334 grad_norm: 0.0190 loss: 0.6873 2023/05/31 22:15:43 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:16:19 - mmengine - INFO - Epoch(train) [6][1300/5758] lr: 8.5502e-04 eta: 10:26:16 time: 0.4148 data_time: 0.0020 memory: 20334 grad_norm: 0.0140 loss: 0.6891 2023/05/31 22:17:00 - mmengine - INFO - Epoch(train) [6][1400/5758] lr: 8.5502e-04 eta: 10:25:22 time: 0.3663 data_time: 0.0024 memory: 20334 grad_norm: 0.0121 loss: 0.6891 2023/05/31 22:17:41 - mmengine - INFO - Epoch(train) [6][1500/5758] lr: 8.5502e-04 eta: 10:24:29 time: 0.4630 data_time: 0.0023 memory: 20334 grad_norm: 0.0190 loss: 0.6904 2023/05/31 22:18:23 - mmengine - INFO - Epoch(train) [6][1600/5758] lr: 8.5502e-04 eta: 10:23:39 time: 0.4389 data_time: 0.0030 memory: 20334 grad_norm: 0.0143 loss: 0.6874 2023/05/31 22:19:03 - mmengine - INFO - Epoch(train) [6][1700/5758] lr: 8.5502e-04 eta: 10:22:44 time: 0.3718 data_time: 0.0019 memory: 20334 grad_norm: 0.0181 loss: 0.6867 2023/05/31 22:19:45 - mmengine - INFO - Epoch(train) [6][1800/5758] lr: 8.5502e-04 eta: 10:21:53 time: 0.3911 data_time: 0.0017 memory: 20334 grad_norm: 0.0164 loss: 0.6881 2023/05/31 22:20:26 - mmengine - INFO - Epoch(train) [6][1900/5758] lr: 8.5502e-04 eta: 10:21:01 time: 0.4190 data_time: 0.0026 memory: 20334 grad_norm: 0.0180 loss: 0.6916 2023/05/31 22:21:06 - mmengine - INFO - Epoch(train) [6][2000/5758] lr: 8.5502e-04 eta: 10:20:05 time: 0.3862 data_time: 0.0022 memory: 20334 grad_norm: 0.0198 loss: 0.6882 2023/05/31 22:21:47 - mmengine - INFO - Epoch(train) [6][2100/5758] lr: 8.5502e-04 eta: 10:19:12 time: 0.4260 data_time: 0.0032 memory: 20334 grad_norm: 0.0146 loss: 0.6861 2023/05/31 22:22:29 - mmengine - INFO - Epoch(train) [6][2200/5758] lr: 8.5502e-04 eta: 10:18:23 time: 0.4317 
data_time: 0.0025 memory: 20334 grad_norm: 0.0162 loss: 0.6831 2023/05/31 22:22:33 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:23:08 - mmengine - INFO - Epoch(train) [6][2300/5758] lr: 8.5502e-04 eta: 10:17:26 time: 0.3788 data_time: 0.0024 memory: 20334 grad_norm: 0.0135 loss: 0.6834 2023/05/31 22:23:50 - mmengine - INFO - Epoch(train) [6][2400/5758] lr: 8.5502e-04 eta: 10:16:34 time: 0.4986 data_time: 0.0027 memory: 20334 grad_norm: 0.0187 loss: 0.6832 2023/05/31 22:24:31 - mmengine - INFO - Epoch(train) [6][2500/5758] lr: 8.5502e-04 eta: 10:15:43 time: 0.4379 data_time: 0.0024 memory: 20334 grad_norm: 0.0160 loss: 0.6868 2023/05/31 22:25:12 - mmengine - INFO - Epoch(train) [6][2600/5758] lr: 8.5502e-04 eta: 10:14:51 time: 0.4220 data_time: 0.0025 memory: 20334 grad_norm: 0.0152 loss: 0.6876 2023/05/31 22:25:53 - mmengine - INFO - Epoch(train) [6][2700/5758] lr: 8.5502e-04 eta: 10:14:00 time: 0.3924 data_time: 0.0022 memory: 20334 grad_norm: 0.0156 loss: 0.6884 2023/05/31 22:26:33 - mmengine - INFO - Epoch(train) [6][2800/5758] lr: 8.5502e-04 eta: 10:13:04 time: 0.3723 data_time: 0.0024 memory: 20334 grad_norm: 0.0140 loss: 0.6841 2023/05/31 22:27:16 - mmengine - INFO - Epoch(train) [6][2900/5758] lr: 8.5502e-04 eta: 10:12:16 time: 0.4717 data_time: 0.0028 memory: 20334 grad_norm: 0.0248 loss: 0.6864 2023/05/31 22:27:56 - mmengine - INFO - Epoch(train) [6][3000/5758] lr: 8.5502e-04 eta: 10:11:23 time: 0.4016 data_time: 0.0021 memory: 20334 grad_norm: 0.0141 loss: 0.6857 2023/05/31 22:28:38 - mmengine - INFO - Epoch(train) [6][3100/5758] lr: 8.5502e-04 eta: 10:10:33 time: 0.4071 data_time: 0.0019 memory: 20334 grad_norm: 0.0214 loss: 0.6921 2023/05/31 22:29:19 - mmengine - INFO - Epoch(train) [6][3200/5758] lr: 8.5502e-04 eta: 10:09:41 time: 0.4218 data_time: 0.0025 memory: 20334 grad_norm: 0.0125 loss: 0.6927 2023/05/31 22:29:23 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:30:00 - 
mmengine - INFO - Epoch(train) [6][3300/5758] lr: 8.5502e-04 eta: 10:08:51 time: 0.3823 data_time: 0.0025 memory: 20334 grad_norm: 0.0199 loss: 0.6829 2023/05/31 22:30:41 - mmengine - INFO - Epoch(train) [6][3400/5758] lr: 8.5502e-04 eta: 10:07:57 time: 0.4505 data_time: 0.0018 memory: 20334 grad_norm: 0.0151 loss: 0.6896 2023/05/31 22:31:21 - mmengine - INFO - Epoch(train) [6][3500/5758] lr: 8.5502e-04 eta: 10:07:05 time: 0.4410 data_time: 0.0022 memory: 20334 grad_norm: 0.0186 loss: 0.6846 2023/05/31 22:32:02 - mmengine - INFO - Epoch(train) [6][3600/5758] lr: 8.5502e-04 eta: 10:06:13 time: 0.3710 data_time: 0.0017 memory: 20334 grad_norm: 0.0135 loss: 0.6896 2023/05/31 22:32:43 - mmengine - INFO - Epoch(train) [6][3700/5758] lr: 8.5502e-04 eta: 10:05:21 time: 0.4106 data_time: 0.0021 memory: 20334 grad_norm: 0.0225 loss: 0.6931 2023/05/31 22:33:23 - mmengine - INFO - Epoch(train) [6][3800/5758] lr: 8.5502e-04 eta: 10:04:27 time: 0.4088 data_time: 0.0025 memory: 20334 grad_norm: 0.0137 loss: 0.6867 2023/05/31 22:34:04 - mmengine - INFO - Epoch(train) [6][3900/5758] lr: 8.5502e-04 eta: 10:03:35 time: 0.4119 data_time: 0.0017 memory: 20334 grad_norm: 0.0203 loss: 0.6895 2023/05/31 22:34:44 - mmengine - INFO - Epoch(train) [6][4000/5758] lr: 8.5502e-04 eta: 10:02:41 time: 0.3971 data_time: 0.0028 memory: 20334 grad_norm: 0.0173 loss: 0.6862 2023/05/31 22:35:25 - mmengine - INFO - Epoch(train) [6][4100/5758] lr: 8.5502e-04 eta: 10:01:51 time: 0.3677 data_time: 0.0017 memory: 20334 grad_norm: 0.0246 loss: 0.6876 2023/05/31 22:36:06 - mmengine - INFO - Epoch(train) [6][4200/5758] lr: 8.5502e-04 eta: 10:01:00 time: 0.4329 data_time: 0.0016 memory: 20334 grad_norm: 0.0176 loss: 0.6919 2023/05/31 22:36:10 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:36:47 - mmengine - INFO - Epoch(train) [6][4300/5758] lr: 8.5502e-04 eta: 10:00:09 time: 0.4061 data_time: 0.0029 memory: 20334 grad_norm: 0.0139 loss: 0.6934 2023/05/31 22:37:27 - 
mmengine - INFO - Epoch(train) [6][4400/5758] lr: 8.5502e-04 eta: 9:59:16 time: 0.4042 data_time: 0.0019 memory: 20334 grad_norm: 0.0120 loss: 0.6842 2023/05/31 22:38:08 - mmengine - INFO - Epoch(train) [6][4500/5758] lr: 8.5502e-04 eta: 9:58:25 time: 0.3866 data_time: 0.0019 memory: 20334 grad_norm: 0.0176 loss: 0.6855 2023/05/31 22:38:50 - mmengine - INFO - Epoch(train) [6][4600/5758] lr: 8.5502e-04 eta: 9:57:36 time: 0.3988 data_time: 0.0020 memory: 20334 grad_norm: 0.0179 loss: 0.6893 2023/05/31 22:39:30 - mmengine - INFO - Epoch(train) [6][4700/5758] lr: 8.5502e-04 eta: 9:56:43 time: 0.4070 data_time: 0.0023 memory: 20334 grad_norm: 0.0114 loss: 0.6911 2023/05/31 22:40:10 - mmengine - INFO - Epoch(train) [6][4800/5758] lr: 8.5502e-04 eta: 9:55:50 time: 0.3735 data_time: 0.0022 memory: 20334 grad_norm: 0.0144 loss: 0.6865 2023/05/31 22:40:50 - mmengine - INFO - Epoch(train) [6][4900/5758] lr: 8.5502e-04 eta: 9:54:57 time: 0.3939 data_time: 0.0024 memory: 20334 grad_norm: 0.0183 loss: 0.6883 2023/05/31 22:41:30 - mmengine - INFO - Epoch(train) [6][5000/5758] lr: 8.5502e-04 eta: 9:54:04 time: 0.4602 data_time: 0.0022 memory: 20334 grad_norm: 0.0245 loss: 0.6903 2023/05/31 22:42:11 - mmengine - INFO - Epoch(train) [6][5100/5758] lr: 8.5502e-04 eta: 9:53:13 time: 0.4275 data_time: 0.0022 memory: 20334 grad_norm: 0.0258 loss: 0.6845 2023/05/31 22:42:52 - mmengine - INFO - Epoch(train) [6][5200/5758] lr: 8.5502e-04 eta: 9:52:21 time: 0.4513 data_time: 0.0023 memory: 20334 grad_norm: 0.0158 loss: 0.6855 2023/05/31 22:42:56 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:43:34 - mmengine - INFO - Epoch(train) [6][5300/5758] lr: 8.5502e-04 eta: 9:51:34 time: 0.4971 data_time: 0.0019 memory: 20334 grad_norm: 0.0185 loss: 0.6863 2023/05/31 22:44:14 - mmengine - INFO - Epoch(train) [6][5400/5758] lr: 8.5502e-04 eta: 9:50:42 time: 0.4245 data_time: 0.0024 memory: 20334 grad_norm: 0.0111 loss: 0.6906 2023/05/31 22:44:55 - mmengine - INFO - 
Epoch(train) [6][5500/5758] lr: 8.5502e-04 eta: 9:49:51 time: 0.4005 data_time: 0.0028 memory: 20334 grad_norm: 0.0163 loss: 0.6868 2023/05/31 22:45:42 - mmengine - INFO - Epoch(train) [6][5600/5758] lr: 8.5502e-04 eta: 9:49:15 time: 0.5972 data_time: 0.0018 memory: 20334 grad_norm: 0.0120 loss: 0.6849 2023/05/31 22:46:22 - mmengine - INFO - Epoch(train) [6][5700/5758] lr: 8.5502e-04 eta: 9:48:22 time: 0.3656 data_time: 0.0018 memory: 20334 grad_norm: 0.0100 loss: 0.6855 2023/05/31 22:46:44 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:46:44 - mmengine - INFO - Saving checkpoint at 6 epochs 2023/05/31 22:47:03 - mmengine - INFO - Epoch(val) [6][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3767 time: 1.0327 2023/05/31 22:47:51 - mmengine - INFO - Epoch(train) [7][ 100/5758] lr: 7.9595e-04 eta: 9:47:15 time: 0.3915 data_time: 0.0027 memory: 20334 grad_norm: 0.0088 loss: 0.6865 2023/05/31 22:48:32 - mmengine - INFO - Epoch(train) [7][ 200/5758] lr: 7.9595e-04 eta: 9:46:25 time: 0.3915 data_time: 0.0023 memory: 20334 grad_norm: 0.0178 loss: 0.6882 2023/05/31 22:49:13 - mmengine - INFO - Epoch(train) [7][ 300/5758] lr: 7.9595e-04 eta: 9:45:36 time: 0.4006 data_time: 0.0028 memory: 20334 grad_norm: 0.0171 loss: 0.6869 2023/05/31 22:49:55 - mmengine - INFO - Epoch(train) [7][ 400/5758] lr: 7.9595e-04 eta: 9:44:48 time: 0.4888 data_time: 0.0027 memory: 20334 grad_norm: 0.0208 loss: 0.6847 2023/05/31 22:50:17 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:50:38 - mmengine - INFO - Epoch(train) [7][ 500/5758] lr: 7.9595e-04 eta: 9:44:01 time: 0.4256 data_time: 0.0025 memory: 20334 grad_norm: 0.0154 loss: 0.6861 2023/05/31 22:51:18 - mmengine - INFO - Epoch(train) [7][ 600/5758] lr: 7.9595e-04 eta: 9:43:11 time: 0.4233 data_time: 0.0031 memory: 20334 grad_norm: 0.0174 
loss: 0.6872 2023/05/31 22:52:00 - mmengine - INFO - Epoch(train) [7][ 700/5758] lr: 7.9595e-04 eta: 9:42:22 time: 0.4026 data_time: 0.0020 memory: 20334 grad_norm: 0.0144 loss: 0.6850 2023/05/31 22:53:08 - mmengine - INFO - Epoch(train) [7][ 800/5758] lr: 7.9595e-04 eta: 9:42:33 time: 1.0925 data_time: 0.0024 memory: 20334 grad_norm: 0.0130 loss: 0.6876 2023/05/31 22:53:45 - mmengine - INFO - Epoch(train) [7][ 900/5758] lr: 7.9595e-04 eta: 9:41:35 time: 0.3737 data_time: 0.0025 memory: 20334 grad_norm: 0.0184 loss: 0.6845 2023/05/31 22:54:24 - mmengine - INFO - Epoch(train) [7][1000/5758] lr: 7.9595e-04 eta: 9:40:41 time: 0.3978 data_time: 0.0023 memory: 20334 grad_norm: 0.0258 loss: 0.6840 2023/05/31 22:55:04 - mmengine - INFO - Epoch(train) [7][1100/5758] lr: 7.9595e-04 eta: 9:39:48 time: 0.3873 data_time: 0.0025 memory: 20334 grad_norm: 0.0184 loss: 0.6846 2023/05/31 22:55:45 - mmengine - INFO - Epoch(train) [7][1200/5758] lr: 7.9595e-04 eta: 9:38:58 time: 0.4038 data_time: 0.0028 memory: 20334 grad_norm: 0.0246 loss: 0.6853 2023/05/31 22:56:26 - mmengine - INFO - Epoch(train) [7][1300/5758] lr: 7.9595e-04 eta: 9:38:07 time: 0.4370 data_time: 0.0019 memory: 20334 grad_norm: 0.0178 loss: 0.6870 2023/05/31 22:57:05 - mmengine - INFO - Epoch(train) [7][1400/5758] lr: 7.9595e-04 eta: 9:37:14 time: 0.3937 data_time: 0.0030 memory: 20334 grad_norm: 0.0242 loss: 0.6844 2023/05/31 22:57:25 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 22:57:45 - mmengine - INFO - Epoch(train) [7][1500/5758] lr: 7.9595e-04 eta: 9:36:21 time: 0.4178 data_time: 0.0019 memory: 20334 grad_norm: 0.0087 loss: 0.6863 2023/05/31 22:58:24 - mmengine - INFO - Epoch(train) [7][1600/5758] lr: 7.9595e-04 eta: 9:35:28 time: 0.4099 data_time: 0.0021 memory: 20334 grad_norm: 0.0165 loss: 0.6887 2023/05/31 22:59:05 - mmengine - INFO - Epoch(train) [7][1700/5758] lr: 7.9595e-04 eta: 9:34:39 time: 0.4031 data_time: 0.0026 memory: 20334 grad_norm: 0.0192 loss: 0.6890 
2023/05/31 22:59:46 - mmengine - INFO - Epoch(train) [7][1800/5758] lr: 7.9595e-04 eta: 9:33:48 time: 0.3943 data_time: 0.0024 memory: 20334 grad_norm: 0.0139 loss: 0.6847 2023/05/31 23:00:27 - mmengine - INFO - Epoch(train) [7][1900/5758] lr: 7.9595e-04 eta: 9:32:59 time: 0.4035 data_time: 0.0023 memory: 20334 grad_norm: 0.0245 loss: 0.6876 2023/05/31 23:01:09 - mmengine - INFO - Epoch(train) [7][2000/5758] lr: 7.9595e-04 eta: 9:32:11 time: 0.4083 data_time: 0.0030 memory: 20334 grad_norm: 0.0135 loss: 0.6861 2023/05/31 23:01:49 - mmengine - INFO - Epoch(train) [7][2100/5758] lr: 7.9595e-04 eta: 9:31:20 time: 0.3640 data_time: 0.0024 memory: 20334 grad_norm: 0.0168 loss: 0.6863 2023/05/31 23:02:29 - mmengine - INFO - Epoch(train) [7][2200/5758] lr: 7.9595e-04 eta: 9:30:29 time: 0.4015 data_time: 0.0023 memory: 20334 grad_norm: 0.0144 loss: 0.6854 2023/05/31 23:03:09 - mmengine - INFO - Epoch(train) [7][2300/5758] lr: 7.9595e-04 eta: 9:29:38 time: 0.3630 data_time: 0.0021 memory: 20334 grad_norm: 0.0163 loss: 0.6862 2023/05/31 23:03:50 - mmengine - INFO - Epoch(train) [7][2400/5758] lr: 7.9595e-04 eta: 9:28:47 time: 0.3996 data_time: 0.0017 memory: 20334 grad_norm: 0.0198 loss: 0.6823 2023/05/31 23:04:10 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:04:30 - mmengine - INFO - Epoch(train) [7][2500/5758] lr: 7.9595e-04 eta: 9:27:56 time: 0.4455 data_time: 0.0023 memory: 20334 grad_norm: 0.0218 loss: 0.6903 2023/05/31 23:05:10 - mmengine - INFO - Epoch(train) [7][2600/5758] lr: 7.9595e-04 eta: 9:27:06 time: 0.4206 data_time: 0.0026 memory: 20334 grad_norm: 0.0181 loss: 0.6891 2023/05/31 23:05:51 - mmengine - INFO - Epoch(train) [7][2700/5758] lr: 7.9595e-04 eta: 9:26:16 time: 0.3833 data_time: 0.0017 memory: 20334 grad_norm: 0.0279 loss: 0.6872 2023/05/31 23:06:32 - mmengine - INFO - Epoch(train) [7][2800/5758] lr: 7.9595e-04 eta: 9:25:27 time: 0.4038 data_time: 0.0017 memory: 20334 grad_norm: 0.0139 loss: 0.6853 2023/05/31 
23:07:12 - mmengine - INFO - Epoch(train) [7][2900/5758] lr: 7.9595e-04 eta: 9:24:36 time: 0.3934 data_time: 0.0025 memory: 20334 grad_norm: 0.0113 loss: 0.6854 2023/05/31 23:07:52 - mmengine - INFO - Epoch(train) [7][3000/5758] lr: 7.9595e-04 eta: 9:23:45 time: 0.4036 data_time: 0.0023 memory: 20334 grad_norm: 0.0171 loss: 0.6871 2023/05/31 23:08:32 - mmengine - INFO - Epoch(train) [7][3100/5758] lr: 7.9595e-04 eta: 9:22:55 time: 0.4592 data_time: 0.0017 memory: 20334 grad_norm: 0.0211 loss: 0.6862 2023/05/31 23:09:13 - mmengine - INFO - Epoch(train) [7][3200/5758] lr: 7.9595e-04 eta: 9:22:05 time: 0.3824 data_time: 0.0029 memory: 20334 grad_norm: 0.0181 loss: 0.6914 2023/05/31 23:09:53 - mmengine - INFO - Epoch(train) [7][3300/5758] lr: 7.9595e-04 eta: 9:21:15 time: 0.4048 data_time: 0.0025 memory: 20334 grad_norm: 0.0138 loss: 0.6873 2023/05/31 23:10:33 - mmengine - INFO - Epoch(train) [7][3400/5758] lr: 7.9595e-04 eta: 9:20:23 time: 0.3673 data_time: 0.0021 memory: 20334 grad_norm: 0.0177 loss: 0.6882 2023/05/31 23:10:54 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:11:13 - mmengine - INFO - Epoch(train) [7][3500/5758] lr: 7.9595e-04 eta: 9:19:32 time: 0.4064 data_time: 0.0025 memory: 20334 grad_norm: 0.0139 loss: 0.6901 2023/05/31 23:11:54 - mmengine - INFO - Epoch(train) [7][3600/5758] lr: 7.9595e-04 eta: 9:18:43 time: 0.4259 data_time: 0.0017 memory: 20334 grad_norm: 0.0126 loss: 0.6852 2023/05/31 23:12:34 - mmengine - INFO - Epoch(train) [7][3700/5758] lr: 7.9595e-04 eta: 9:17:53 time: 0.4189 data_time: 0.0026 memory: 20334 grad_norm: 0.0186 loss: 0.6862 2023/05/31 23:13:15 - mmengine - INFO - Epoch(train) [7][3800/5758] lr: 7.9595e-04 eta: 9:17:05 time: 0.3871 data_time: 0.0015 memory: 20334 grad_norm: 0.0215 loss: 0.6917 2023/05/31 23:13:55 - mmengine - INFO - Epoch(train) [7][3900/5758] lr: 7.9595e-04 eta: 9:16:15 time: 0.4560 data_time: 0.0015 memory: 20334 grad_norm: 0.0234 loss: 0.6864 2023/05/31 23:14:35 - 
mmengine - INFO - Epoch(train) [7][4000/5758] lr: 7.9595e-04 eta: 9:15:24 time: 0.3990 data_time: 0.0019 memory: 20334 grad_norm: 0.0159 loss: 0.6854 2023/05/31 23:15:16 - mmengine - INFO - Epoch(train) [7][4100/5758] lr: 7.9595e-04 eta: 9:14:35 time: 0.3968 data_time: 0.0016 memory: 20334 grad_norm: 0.0163 loss: 0.6881 2023/05/31 23:15:55 - mmengine - INFO - Epoch(train) [7][4200/5758] lr: 7.9595e-04 eta: 9:13:43 time: 0.4082 data_time: 0.0016 memory: 20334 grad_norm: 0.0161 loss: 0.6845 2023/05/31 23:16:37 - mmengine - INFO - Epoch(train) [7][4300/5758] lr: 7.9595e-04 eta: 9:12:57 time: 0.3830 data_time: 0.0019 memory: 20334 grad_norm: 0.0181 loss: 0.6873 2023/05/31 23:17:18 - mmengine - INFO - Epoch(train) [7][4400/5758] lr: 7.9595e-04 eta: 9:12:08 time: 0.4112 data_time: 0.0026 memory: 20334 grad_norm: 0.0142 loss: 0.6873 2023/05/31 23:17:38 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:17:58 - mmengine - INFO - Epoch(train) [7][4500/5758] lr: 7.9595e-04 eta: 9:11:19 time: 0.3989 data_time: 0.0018 memory: 20334 grad_norm: 0.0202 loss: 0.6883 2023/05/31 23:18:40 - mmengine - INFO - Epoch(train) [7][4600/5758] lr: 7.9595e-04 eta: 9:10:31 time: 0.4141 data_time: 0.0017 memory: 20334 grad_norm: 0.0102 loss: 0.6852 2023/05/31 23:19:20 - mmengine - INFO - Epoch(train) [7][4700/5758] lr: 7.9595e-04 eta: 9:09:41 time: 0.5179 data_time: 0.0019 memory: 20334 grad_norm: 0.0136 loss: 0.6882 2023/05/31 23:20:01 - mmengine - INFO - Epoch(train) [7][4800/5758] lr: 7.9595e-04 eta: 9:08:53 time: 0.4319 data_time: 0.0017 memory: 20334 grad_norm: 0.0147 loss: 0.6873 2023/05/31 23:20:42 - mmengine - INFO - Epoch(train) [7][4900/5758] lr: 7.9595e-04 eta: 9:08:05 time: 0.3954 data_time: 0.0017 memory: 20334 grad_norm: 0.0126 loss: 0.6876 2023/05/31 23:21:22 - mmengine - INFO - Epoch(train) [7][5000/5758] lr: 7.9595e-04 eta: 9:07:16 time: 0.3781 data_time: 0.0016 memory: 20334 grad_norm: 0.0188 loss: 0.6844 2023/05/31 23:22:03 - mmengine - INFO - 
Epoch(train) [7][5100/5758] lr: 7.9595e-04 eta: 9:06:26 time: 0.4298 data_time: 0.0018 memory: 20334 grad_norm: 0.0119 loss: 0.6844 2023/05/31 23:22:44 - mmengine - INFO - Epoch(train) [7][5200/5758] lr: 7.9595e-04 eta: 9:05:39 time: 0.4400 data_time: 0.0017 memory: 20334 grad_norm: 0.0133 loss: 0.6878 2023/05/31 23:23:26 - mmengine - INFO - Epoch(train) [7][5300/5758] lr: 7.9595e-04 eta: 9:04:52 time: 0.4161 data_time: 0.0020 memory: 20334 grad_norm: 0.0077 loss: 0.6867 2023/05/31 23:24:08 - mmengine - INFO - Epoch(train) [7][5400/5758] lr: 7.9595e-04 eta: 9:04:08 time: 0.4377 data_time: 0.0017 memory: 20334 grad_norm: 0.0167 loss: 0.6833 2023/05/31 23:24:31 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:24:51 - mmengine - INFO - Epoch(train) [7][5500/5758] lr: 7.9595e-04 eta: 9:03:22 time: 0.3777 data_time: 0.0021 memory: 20334 grad_norm: 0.0198 loss: 0.6925 2023/05/31 23:25:29 - mmengine - INFO - Epoch(train) [7][5600/5758] lr: 7.9595e-04 eta: 9:02:30 time: 0.4018 data_time: 0.0017 memory: 20334 grad_norm: 0.0204 loss: 0.6840 2023/05/31 23:26:09 - mmengine - INFO - Epoch(train) [7][5700/5758] lr: 7.9595e-04 eta: 9:01:40 time: 0.3764 data_time: 0.0021 memory: 20334 grad_norm: 0.0142 loss: 0.6851 2023/05/31 23:26:34 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:26:34 - mmengine - INFO - Saving checkpoint at 7 epochs 2023/05/31 23:26:51 - mmengine - INFO - Epoch(val) [7][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3704 time: 1.0113 2023/05/31 23:27:36 - mmengine - INFO - Epoch(train) [8][ 100/5758] lr: 7.2973e-04 eta: 9:00:33 time: 0.4223 data_time: 0.0014 memory: 20334 grad_norm: 0.0150 loss: 0.6873 2023/05/31 23:28:16 - mmengine - INFO - Epoch(train) [8][ 200/5758] lr: 7.2973e-04 eta: 8:59:43 time: 0.3914 data_time: 0.0015 memory: 20334 grad_norm: 0.0126 
loss: 0.6861 2023/05/31 23:28:56 - mmengine - INFO - Epoch(train) [8][ 300/5758] lr: 7.2973e-04 eta: 8:58:55 time: 0.4302 data_time: 0.0018 memory: 20334 grad_norm: 0.0213 loss: 0.6850 2023/05/31 23:29:37 - mmengine - INFO - Epoch(train) [8][ 400/5758] lr: 7.2973e-04 eta: 8:58:06 time: 0.4036 data_time: 0.0015 memory: 20334 grad_norm: 0.0211 loss: 0.6903 2023/05/31 23:30:18 - mmengine - INFO - Epoch(train) [8][ 500/5758] lr: 7.2973e-04 eta: 8:57:20 time: 0.4276 data_time: 0.0016 memory: 20334 grad_norm: 0.0141 loss: 0.6874 2023/05/31 23:30:59 - mmengine - INFO - Epoch(train) [8][ 600/5758] lr: 7.2973e-04 eta: 8:56:31 time: 0.3991 data_time: 0.0015 memory: 20334 grad_norm: 0.0171 loss: 0.6926 2023/05/31 23:31:38 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/05/31 23:31:41 - mmengine - INFO - Epoch(train) [8][ 700/5758] lr: 7.2973e-04 eta: 8:55:45 time: 0.4612 data_time: 0.0016 memory: 20334 grad_norm: 0.0161 loss: 0.6854 2023/05/31 23:32:22 - mmengine - INFO - Epoch(train) [8][ 800/5758] lr: 7.2973e-04 eta: 8:54:59 time: 0.3963 data_time: 0.0015 memory: 20334 grad_norm: 0.0195 loss: 0.6854 2023/05/31 23:33:03 - mmengine - INFO - Epoch(train) [8][ 900/5758] lr: 7.2973e-04 eta: 8:54:10 time: 0.4091 data_time: 0.0015 memory: 20334 grad_norm: 0.0093 loss: 0.6858 2023/05/31 23:33:43 - mmengine - INFO - Epoch(train) [8][1000/5758] lr: 7.2973e-04 eta: 8:53:21 time: 0.3946 data_time: 0.0014 memory: 20334 grad_norm: 0.0173 loss: 0.6882 2023/05/31 23:34:22 - mmengine - INFO - Epoch(train) [8][1100/5758] lr: 7.2973e-04 eta: 8:52:31 time: 0.3819 data_time: 0.0015 memory: 20334 grad_norm: 0.0206 loss: 0.6901 2023/05/31 23:35:03 - mmengine - INFO - Epoch(train) [8][1200/5758] lr: 7.2973e-04 eta: 8:51:42 time: 0.3753 data_time: 0.0015 memory: 20334 grad_norm: 0.0139 loss: 0.6842 2023/05/31 23:35:43 - mmengine - INFO - Epoch(train) [8][1300/5758] lr: 7.2973e-04 eta: 8:50:53 time: 0.3834 data_time: 0.0014 memory: 20334 grad_norm: 0.0207 loss: 0.6848 
2023/05/31 23:36:23 - mmengine - INFO - Epoch(train) [8][1400/5758] lr: 7.2973e-04 eta: 8:50:05 time: 0.4026 data_time: 0.0015 memory: 20334 grad_norm: 0.0194 loss: 0.6852
2023/05/31 23:37:05 - mmengine - INFO - Epoch(train) [8][1500/5758] lr: 7.2973e-04 eta: 8:49:19 time: 0.4379 data_time: 0.0015 memory: 20334 grad_norm: 0.0174 loss: 0.6905
2023/05/31 23:37:46 - mmengine - INFO - Epoch(train) [8][1600/5758] lr: 7.2973e-04 eta: 8:48:32 time: 0.4068 data_time: 0.0015 memory: 20334 grad_norm: 0.0078 loss: 0.6846
2023/05/31 23:38:25 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 23:38:27 - mmengine - INFO - Epoch(train) [8][1700/5758] lr: 7.2973e-04 eta: 8:47:45 time: 0.3627 data_time: 0.0016 memory: 20334 grad_norm: 0.0138 loss: 0.6831
2023/05/31 23:39:08 - mmengine - INFO - Epoch(train) [8][1800/5758] lr: 7.2973e-04 eta: 8:46:57 time: 0.4403 data_time: 0.0015 memory: 20334 grad_norm: 0.0195 loss: 0.6897
2023/05/31 23:39:49 - mmengine - INFO - Epoch(train) [8][1900/5758] lr: 7.2973e-04 eta: 8:46:09 time: 0.4031 data_time: 0.0015 memory: 20334 grad_norm: 0.0151 loss: 0.6837
2023/05/31 23:40:28 - mmengine - INFO - Epoch(train) [8][2000/5758] lr: 7.2973e-04 eta: 8:45:20 time: 0.4056 data_time: 0.0015 memory: 20334 grad_norm: 0.0154 loss: 0.6881
2023/05/31 23:41:10 - mmengine - INFO - Epoch(train) [8][2100/5758] lr: 7.2973e-04 eta: 8:44:33 time: 0.4114 data_time: 0.0015 memory: 20334 grad_norm: 0.0176 loss: 0.6852
2023/05/31 23:41:49 - mmengine - INFO - Epoch(train) [8][2200/5758] lr: 7.2973e-04 eta: 8:43:43 time: 0.4120 data_time: 0.0012 memory: 20334 grad_norm: 0.0169 loss: 0.6877
2023/05/31 23:42:30 - mmengine - INFO - Epoch(train) [8][2300/5758] lr: 7.2973e-04 eta: 8:42:56 time: 0.4703 data_time: 0.0013 memory: 20334 grad_norm: 0.0109 loss: 0.6843
2023/05/31 23:43:10 - mmengine - INFO - Epoch(train) [8][2400/5758] lr: 7.2973e-04 eta: 8:42:08 time: 0.4309 data_time: 0.0013 memory: 20334 grad_norm: 0.0137 loss: 0.6947
2023/05/31 23:43:50 - mmengine - INFO - Epoch(train) [8][2500/5758] lr: 7.2973e-04 eta: 8:41:19 time: 0.4007 data_time: 0.0012 memory: 20334 grad_norm: 0.0156 loss: 0.6836
2023/05/31 23:44:32 - mmengine - INFO - Epoch(train) [8][2600/5758] lr: 7.2973e-04 eta: 8:40:34 time: 0.4241 data_time: 0.0017 memory: 20334 grad_norm: 0.0155 loss: 0.6884
2023/05/31 23:45:12 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 23:45:14 - mmengine - INFO - Epoch(train) [8][2700/5758] lr: 7.2973e-04 eta: 8:39:48 time: 0.4685 data_time: 0.0017 memory: 20334 grad_norm: 0.0120 loss: 0.6840
2023/05/31 23:45:54 - mmengine - INFO - Epoch(train) [8][2800/5758] lr: 7.2973e-04 eta: 8:38:59 time: 0.4118 data_time: 0.0014 memory: 20334 grad_norm: 0.0096 loss: 0.6828
2023/05/31 23:46:35 - mmengine - INFO - Epoch(train) [8][2900/5758] lr: 7.2973e-04 eta: 8:38:12 time: 0.4031 data_time: 0.0019 memory: 20334 grad_norm: 0.0122 loss: 0.6836
2023/05/31 23:47:14 - mmengine - INFO - Epoch(train) [8][3000/5758] lr: 7.2973e-04 eta: 8:37:23 time: 0.3638 data_time: 0.0018 memory: 20334 grad_norm: 0.0161 loss: 0.6854
2023/05/31 23:47:54 - mmengine - INFO - Epoch(train) [8][3100/5758] lr: 7.2973e-04 eta: 8:36:33 time: 0.4331 data_time: 0.0020 memory: 20334 grad_norm: 0.0202 loss: 0.6902
2023/05/31 23:48:33 - mmengine - INFO - Epoch(train) [8][3200/5758] lr: 7.2973e-04 eta: 8:35:44 time: 0.3863 data_time: 0.0021 memory: 20334 grad_norm: 0.0199 loss: 0.6832
2023/05/31 23:49:14 - mmengine - INFO - Epoch(train) [8][3300/5758] lr: 7.2973e-04 eta: 8:34:57 time: 0.4603 data_time: 0.0018 memory: 20334 grad_norm: 0.0144 loss: 0.6889
2023/05/31 23:49:53 - mmengine - INFO - Epoch(train) [8][3400/5758] lr: 7.2973e-04 eta: 8:34:07 time: 0.3926 data_time: 0.0016 memory: 20334 grad_norm: 0.0170 loss: 0.6835
2023/05/31 23:50:34 - mmengine - INFO - Epoch(train) [8][3500/5758] lr: 7.2973e-04 eta: 8:33:19 time: 0.4093 data_time: 0.0018 memory: 20334 grad_norm: 0.0176 loss: 0.6804
2023/05/31 23:51:14 - mmengine - INFO - Epoch(train) [8][3600/5758] lr: 7.2973e-04 eta: 8:32:32 time: 0.4041 data_time: 0.0016 memory: 20334 grad_norm: 0.0166 loss: 0.6864
2023/05/31 23:51:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 23:51:53 - mmengine - INFO - Epoch(train) [8][3700/5758] lr: 7.2973e-04 eta: 8:31:42 time: 0.3867 data_time: 0.0019 memory: 20334 grad_norm: 0.0128 loss: 0.6867
2023/05/31 23:52:33 - mmengine - INFO - Epoch(train) [8][3800/5758] lr: 7.2973e-04 eta: 8:30:53 time: 0.4175 data_time: 0.0023 memory: 20334 grad_norm: 0.0122 loss: 0.6845
2023/05/31 23:53:12 - mmengine - INFO - Epoch(train) [8][3900/5758] lr: 7.2973e-04 eta: 8:30:04 time: 0.3992 data_time: 0.0015 memory: 20334 grad_norm: 0.0148 loss: 0.6846
2023/05/31 23:53:52 - mmengine - INFO - Epoch(train) [8][4000/5758] lr: 7.2973e-04 eta: 8:29:15 time: 0.4085 data_time: 0.0017 memory: 20334 grad_norm: 0.0194 loss: 0.6884
2023/05/31 23:54:32 - mmengine - INFO - Epoch(train) [8][4100/5758] lr: 7.2973e-04 eta: 8:28:27 time: 0.4341 data_time: 0.0019 memory: 20334 grad_norm: 0.0218 loss: 0.6902
2023/05/31 23:55:12 - mmengine - INFO - Epoch(train) [8][4200/5758] lr: 7.2973e-04 eta: 8:27:39 time: 0.4425 data_time: 0.0026 memory: 20334 grad_norm: 0.0092 loss: 0.6831
2023/05/31 23:55:53 - mmengine - INFO - Epoch(train) [8][4300/5758] lr: 7.2973e-04 eta: 8:26:52 time: 0.3826 data_time: 0.0031 memory: 20334 grad_norm: 0.0142 loss: 0.6839
2023/05/31 23:56:32 - mmengine - INFO - Epoch(train) [8][4400/5758] lr: 7.2973e-04 eta: 8:26:03 time: 0.3903 data_time: 0.0020 memory: 20334 grad_norm: 0.0141 loss: 0.6895
2023/05/31 23:57:11 - mmengine - INFO - Epoch(train) [8][4500/5758] lr: 7.2973e-04 eta: 8:25:14 time: 0.4005 data_time: 0.0021 memory: 20334 grad_norm: 0.0095 loss: 0.6898
2023/05/31 23:57:51 - mmengine - INFO - Epoch(train) [8][4600/5758] lr: 7.2973e-04 eta: 8:24:25 time: 0.3952 data_time: 0.0024 memory: 20334 grad_norm: 0.0182 loss: 0.6843
2023/05/31 23:58:29 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/05/31 23:58:31 - mmengine - INFO - Epoch(train) [8][4700/5758] lr: 7.2973e-04 eta: 8:23:39 time: 0.3863 data_time: 0.0017 memory: 20334 grad_norm: 0.0133 loss: 0.6857
2023/05/31 23:59:12 - mmengine - INFO - Epoch(train) [8][4800/5758] lr: 7.2973e-04 eta: 8:22:52 time: 0.4085 data_time: 0.0020 memory: 20334 grad_norm: 0.0106 loss: 0.6831
2023/05/31 23:59:52 - mmengine - INFO - Epoch(train) [8][4900/5758] lr: 7.2973e-04 eta: 8:22:04 time: 0.3887 data_time: 0.0021 memory: 20334 grad_norm: 0.0196 loss: 0.6865
2023/06/01 00:00:34 - mmengine - INFO - Epoch(train) [8][5000/5758] lr: 7.2973e-04 eta: 8:21:19 time: 0.4114 data_time: 0.0019 memory: 20334 grad_norm: 0.0168 loss: 0.6831
2023/06/01 00:01:13 - mmengine - INFO - Epoch(train) [8][5100/5758] lr: 7.2973e-04 eta: 8:20:30 time: 0.4419 data_time: 0.0017 memory: 20334 grad_norm: 0.0174 loss: 0.6849
2023/06/01 00:01:53 - mmengine - INFO - Epoch(train) [8][5200/5758] lr: 7.2973e-04 eta: 8:19:42 time: 0.4285 data_time: 0.0015 memory: 20334 grad_norm: 0.0248 loss: 0.6894
2023/06/01 00:02:33 - mmengine - INFO - Epoch(train) [8][5300/5758] lr: 7.2973e-04 eta: 8:18:55 time: 0.3812 data_time: 0.0017 memory: 20334 grad_norm: 0.0194 loss: 0.6841
2023/06/01 00:03:13 - mmengine - INFO - Epoch(train) [8][5400/5758] lr: 7.2973e-04 eta: 8:18:07 time: 0.4205 data_time: 0.0017 memory: 20334 grad_norm: 0.0108 loss: 0.6873
2023/06/01 00:03:53 - mmengine - INFO - Epoch(train) [8][5500/5758] lr: 7.2973e-04 eta: 8:17:19 time: 0.4115 data_time: 0.0016 memory: 20334 grad_norm: 0.0099 loss: 0.6845
2023/06/01 00:04:33 - mmengine - INFO - Epoch(train) [8][5600/5758] lr: 7.2973e-04 eta: 8:16:31 time: 0.3997 data_time: 0.0018 memory: 20334 grad_norm: 0.0236 loss: 0.6868
2023/06/01 00:05:10 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:05:12 - mmengine - INFO - Epoch(train) [8][5700/5758] lr: 7.2973e-04 eta: 8:15:43 time: 0.3803 data_time: 0.0017 memory: 20334 grad_norm: 0.0141 loss: 0.6926
2023/06/01 00:05:35 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:05:35 - mmengine - INFO - Saving checkpoint at 8 epochs
2023/06/01 00:05:52 - mmengine - INFO - Epoch(val) [8][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3851 time: 1.0094
2023/06/01 00:06:37 - mmengine - INFO - Epoch(train) [9][ 100/5758] lr: 6.5796e-04 eta: 8:14:34 time: 0.3657 data_time: 0.0016 memory: 20334 grad_norm: 0.0142 loss: 0.6866
2023/06/01 00:07:16 - mmengine - INFO - Epoch(train) [9][ 200/5758] lr: 6.5796e-04 eta: 8:13:46 time: 0.4105 data_time: 0.0016 memory: 20334 grad_norm: 0.0199 loss: 0.6880
2023/06/01 00:07:57 - mmengine - INFO - Epoch(train) [9][ 300/5758] lr: 6.5796e-04 eta: 8:12:59 time: 0.3799 data_time: 0.0017 memory: 20334 grad_norm: 0.0085 loss: 0.6842
2023/06/01 00:08:36 - mmengine - INFO - Epoch(train) [9][ 400/5758] lr: 6.5796e-04 eta: 8:12:11 time: 0.3870 data_time: 0.0016 memory: 20334 grad_norm: 0.0180 loss: 0.6870
2023/06/01 00:09:17 - mmengine - INFO - Epoch(train) [9][ 500/5758] lr: 6.5796e-04 eta: 8:11:24 time: 0.4097 data_time: 0.0016 memory: 20334 grad_norm: 0.0165 loss: 0.6872
2023/06/01 00:09:56 - mmengine - INFO - Epoch(train) [9][ 600/5758] lr: 6.5796e-04 eta: 8:10:36 time: 0.4078 data_time: 0.0015 memory: 20334 grad_norm: 0.0171 loss: 0.6877
2023/06/01 00:10:37 - mmengine - INFO - Epoch(train) [9][ 700/5758] lr: 6.5796e-04 eta: 8:09:50 time: 0.4025 data_time: 0.0015 memory: 20334 grad_norm: 0.0262 loss: 0.6899
2023/06/01 00:11:17 - mmengine - INFO - Epoch(train) [9][ 800/5758] lr: 6.5796e-04 eta: 8:09:03 time: 0.4089 data_time: 0.0016 memory: 20334 grad_norm: 0.0134 loss: 0.6825
2023/06/01 00:11:59 - mmengine - INFO - Epoch(train) [9][ 900/5758] lr: 6.5796e-04 eta: 8:08:18 time: 0.4349 data_time: 0.0017 memory: 20334 grad_norm: 0.0166 loss: 0.6853
2023/06/01 00:12:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:12:39 - mmengine - INFO - Epoch(train) [9][1000/5758] lr: 6.5796e-04 eta: 8:07:31 time: 0.4156 data_time: 0.0016 memory: 20334 grad_norm: 0.0297 loss: 0.6826
2023/06/01 00:13:18 - mmengine - INFO - Epoch(train) [9][1100/5758] lr: 6.5796e-04 eta: 8:06:43 time: 0.4004 data_time: 0.0017 memory: 20334 grad_norm: 0.0241 loss: 0.6842
2023/06/01 00:13:58 - mmengine - INFO - Epoch(train) [9][1200/5758] lr: 6.5796e-04 eta: 8:05:55 time: 0.3890 data_time: 0.0015 memory: 20334 grad_norm: 0.0166 loss: 0.6880
2023/06/01 00:14:39 - mmengine - INFO - Epoch(train) [9][1300/5758] lr: 6.5796e-04 eta: 8:05:09 time: 0.4182 data_time: 0.0019 memory: 20334 grad_norm: 0.0172 loss: 0.6901
2023/06/01 00:15:19 - mmengine - INFO - Epoch(train) [9][1400/5758] lr: 6.5796e-04 eta: 8:04:22 time: 0.3768 data_time: 0.0018 memory: 20334 grad_norm: 0.0236 loss: 0.6926
2023/06/01 00:16:00 - mmengine - INFO - Epoch(train) [9][1500/5758] lr: 6.5796e-04 eta: 8:03:36 time: 0.3780 data_time: 0.0016 memory: 20334 grad_norm: 0.0203 loss: 0.6883
2023/06/01 00:16:40 - mmengine - INFO - Epoch(train) [9][1600/5758] lr: 6.5796e-04 eta: 8:02:49 time: 0.4377 data_time: 0.0018 memory: 20334 grad_norm: 0.0207 loss: 0.6895
2023/06/01 00:17:19 - mmengine - INFO - Epoch(train) [9][1700/5758] lr: 6.5796e-04 eta: 8:02:01 time: 0.4236 data_time: 0.0015 memory: 20334 grad_norm: 0.0263 loss: 0.6902
2023/06/01 00:17:58 - mmengine - INFO - Epoch(train) [9][1800/5758] lr: 6.5796e-04 eta: 8:01:12 time: 0.3900 data_time: 0.0015 memory: 20334 grad_norm: 0.0167 loss: 0.6927
2023/06/01 00:18:38 - mmengine - INFO - Epoch(train) [9][1900/5758] lr: 6.5796e-04 eta: 8:00:25 time: 0.4081 data_time: 0.0014 memory: 20334 grad_norm: 0.0168 loss: 0.6866
2023/06/01 00:18:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:19:18 - mmengine - INFO - Epoch(train) [9][2000/5758] lr: 6.5796e-04 eta: 7:59:39 time: 0.4291 data_time: 0.0015 memory: 20334 grad_norm: 0.0119 loss: 0.6865
2023/06/01 00:19:57 - mmengine - INFO - Epoch(train) [9][2100/5758] lr: 6.5796e-04 eta: 7:58:50 time: 0.3696 data_time: 0.0015 memory: 20334 grad_norm: 0.0185 loss: 0.6884
2023/06/01 00:20:38 - mmengine - INFO - Epoch(train) [9][2200/5758] lr: 6.5796e-04 eta: 7:58:04 time: 0.4319 data_time: 0.0014 memory: 20334 grad_norm: 0.0166 loss: 0.6853
2023/06/01 00:21:18 - mmengine - INFO - Epoch(train) [9][2300/5758] lr: 6.5796e-04 eta: 7:57:17 time: 0.4174 data_time: 0.0015 memory: 20334 grad_norm: 0.0148 loss: 0.6844
2023/06/01 00:21:58 - mmengine - INFO - Epoch(train) [9][2400/5758] lr: 6.5796e-04 eta: 7:56:30 time: 0.4347 data_time: 0.0015 memory: 20334 grad_norm: 0.0178 loss: 0.6882
2023/06/01 00:22:38 - mmengine - INFO - Epoch(train) [9][2500/5758] lr: 6.5796e-04 eta: 7:55:44 time: 0.3834 data_time: 0.0017 memory: 20334 grad_norm: 0.0168 loss: 0.6823
2023/06/01 00:23:18 - mmengine - INFO - Epoch(train) [9][2600/5758] lr: 6.5796e-04 eta: 7:54:57 time: 0.3810 data_time: 0.0017 memory: 20334 grad_norm: 0.0126 loss: 0.6860
2023/06/01 00:23:58 - mmengine - INFO - Epoch(train) [9][2700/5758] lr: 6.5796e-04 eta: 7:54:10 time: 0.3961 data_time: 0.0017 memory: 20334 grad_norm: 0.0197 loss: 0.6837
2023/06/01 00:24:39 - mmengine - INFO - Epoch(train) [9][2800/5758] lr: 6.5796e-04 eta: 7:53:26 time: 0.4435 data_time: 0.0018 memory: 20334 grad_norm: 0.0157 loss: 0.6875
2023/06/01 00:25:19 - mmengine - INFO - Epoch(train) [9][2900/5758] lr: 6.5796e-04 eta: 7:52:38 time: 0.3820 data_time: 0.0018 memory: 20334 grad_norm: 0.0180 loss: 0.6824
2023/06/01 00:25:33 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:25:59 - mmengine - INFO - Epoch(train) [9][3000/5758] lr: 6.5796e-04 eta: 7:51:51 time: 0.4053 data_time: 0.0016 memory: 20334 grad_norm: 0.0163 loss: 0.6910
2023/06/01 00:26:38 - mmengine - INFO - Epoch(train) [9][3100/5758] lr: 6.5796e-04 eta: 7:51:04 time: 0.3820 data_time: 0.0016 memory: 20334 grad_norm: 0.0174 loss: 0.6902
2023/06/01 00:27:18 - mmengine - INFO - Epoch(train) [9][3200/5758] lr: 6.5796e-04 eta: 7:50:18 time: 0.4061 data_time: 0.0018 memory: 20334 grad_norm: 0.0130 loss: 0.6888
2023/06/01 00:27:58 - mmengine - INFO - Epoch(train) [9][3300/5758] lr: 6.5796e-04 eta: 7:49:31 time: 0.3685 data_time: 0.0017 memory: 20334 grad_norm: 0.0157 loss: 0.6879
2023/06/01 00:28:39 - mmengine - INFO - Epoch(train) [9][3400/5758] lr: 6.5796e-04 eta: 7:48:45 time: 0.4373 data_time: 0.0017 memory: 20334 grad_norm: 0.0146 loss: 0.6809
2023/06/01 00:29:19 - mmengine - INFO - Epoch(train) [9][3500/5758] lr: 6.5796e-04 eta: 7:47:59 time: 0.3634 data_time: 0.0019 memory: 20334 grad_norm: 0.0162 loss: 0.6866
2023/06/01 00:29:59 - mmengine - INFO - Epoch(train) [9][3600/5758] lr: 6.5796e-04 eta: 7:47:13 time: 0.3802 data_time: 0.0017 memory: 20334 grad_norm: 0.0205 loss: 0.6849
2023/06/01 00:30:39 - mmengine - INFO - Epoch(train) [9][3700/5758] lr: 6.5796e-04 eta: 7:46:25 time: 0.3969 data_time: 0.0016 memory: 20334 grad_norm: 0.0171 loss: 0.6884
2023/06/01 00:31:19 - mmengine - INFO - Epoch(train) [9][3800/5758] lr: 6.5796e-04 eta: 7:45:40 time: 0.4089 data_time: 0.0017 memory: 20334 grad_norm: 0.0101 loss: 0.6838
2023/06/01 00:31:59 - mmengine - INFO - Epoch(train) [9][3900/5758] lr: 6.5796e-04 eta: 7:44:53 time: 0.3888 data_time: 0.0018 memory: 20334 grad_norm: 0.0148 loss: 0.6896
2023/06/01 00:32:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:32:40 - mmengine - INFO - Epoch(train) [9][4000/5758] lr: 6.5796e-04 eta: 7:44:08 time: 0.4657 data_time: 0.0015 memory: 20334 grad_norm: 0.0133 loss: 0.6859
2023/06/01 00:33:21 - mmengine - INFO - Epoch(train) [9][4100/5758] lr: 6.5796e-04 eta: 7:43:22 time: 0.4193 data_time: 0.0016 memory: 20334 grad_norm: 0.0126 loss: 0.6846
2023/06/01 00:34:01 - mmengine - INFO - Epoch(train) [9][4200/5758] lr: 6.5796e-04 eta: 7:42:36 time: 0.4574 data_time: 0.0016 memory: 20334 grad_norm: 0.0164 loss: 0.6868
2023/06/01 00:34:40 - mmengine - INFO - Epoch(train) [9][4300/5758] lr: 6.5796e-04 eta: 7:41:49 time: 0.4001 data_time: 0.0016 memory: 20334 grad_norm: 0.0199 loss: 0.6844
2023/06/01 00:35:20 - mmengine - INFO - Epoch(train) [9][4400/5758] lr: 6.5796e-04 eta: 7:41:03 time: 0.4158 data_time: 0.0014 memory: 20334 grad_norm: 0.0163 loss: 0.6905
2023/06/01 00:36:00 - mmengine - INFO - Epoch(train) [9][4500/5758] lr: 6.5796e-04 eta: 7:40:16 time: 0.4090 data_time: 0.0013 memory: 20334 grad_norm: 0.0154 loss: 0.6873
2023/06/01 00:36:40 - mmengine - INFO - Epoch(train) [9][4600/5758] lr: 6.5796e-04 eta: 7:39:30 time: 0.4068 data_time: 0.0014 memory: 20334 grad_norm: 0.0163 loss: 0.6849
2023/06/01 00:37:20 - mmengine - INFO - Epoch(train) [9][4700/5758] lr: 6.5796e-04 eta: 7:38:43 time: 0.3742 data_time: 0.0013 memory: 20334 grad_norm: 0.0120 loss: 0.6895
2023/06/01 00:38:00 - mmengine - INFO - Epoch(train) [9][4800/5758] lr: 6.5796e-04 eta: 7:37:57 time: 0.3626 data_time: 0.0013 memory: 20334 grad_norm: 0.0179 loss: 0.6838
2023/06/01 00:38:40 - mmengine - INFO - Epoch(train) [9][4900/5758] lr: 6.5796e-04 eta: 7:37:11 time: 0.4155 data_time: 0.0016 memory: 20334 grad_norm: 0.0104 loss: 0.6882
2023/06/01 00:38:55 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:39:21 - mmengine - INFO - Epoch(train) [9][5000/5758] lr: 6.5796e-04 eta: 7:36:26 time: 0.4163 data_time: 0.0016 memory: 20334 grad_norm: 0.0163 loss: 0.6855
2023/06/01 00:40:00 - mmengine - INFO - Epoch(train) [9][5100/5758] lr: 6.5796e-04 eta: 7:35:39 time: 0.3880 data_time: 0.0014 memory: 20334 grad_norm: 0.0136 loss: 0.6881
2023/06/01 00:40:40 - mmengine - INFO - Epoch(train) [9][5200/5758] lr: 6.5796e-04 eta: 7:34:53 time: 0.3931 data_time: 0.0015 memory: 20334 grad_norm: 0.0151 loss: 0.6864
2023/06/01 00:41:20 - mmengine - INFO - Epoch(train) [9][5300/5758] lr: 6.5796e-04 eta: 7:34:06 time: 0.3680 data_time: 0.0015 memory: 20334 grad_norm: 0.0214 loss: 0.6849
2023/06/01 00:42:00 - mmengine - INFO - Epoch(train) [9][5400/5758] lr: 6.5796e-04 eta: 7:33:20 time: 0.4489 data_time: 0.0017 memory: 20334 grad_norm: 0.0175 loss: 0.6825
2023/06/01 00:42:39 - mmengine - INFO - Epoch(train) [9][5500/5758] lr: 6.5796e-04 eta: 7:32:33 time: 0.4165 data_time: 0.0017 memory: 20334 grad_norm: 0.0185 loss: 0.6859
2023/06/01 00:43:18 - mmengine - INFO - Epoch(train) [9][5600/5758] lr: 6.5796e-04 eta: 7:31:46 time: 0.3864 data_time: 0.0018 memory: 20334 grad_norm: 0.0111 loss: 0.6867
2023/06/01 00:43:58 - mmengine - INFO - Epoch(train) [9][5700/5758] lr: 6.5796e-04 eta: 7:31:00 time: 0.3709 data_time: 0.0015 memory: 20334 grad_norm: 0.0168 loss: 0.6819
2023/06/01 00:44:21 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:44:21 - mmengine - INFO - Saving checkpoint at 9 epochs
2023/06/01 00:44:38 - mmengine - INFO - Epoch(val) [9][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3715 time: 0.9948
2023/06/01 00:45:22 - mmengine - INFO - Epoch(train) [10][ 100/5758] lr: 5.8244e-04 eta: 7:29:53 time: 0.4262 data_time: 0.0016 memory: 20334 grad_norm: 0.0136 loss: 0.6862
2023/06/01 00:45:54 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:46:03 - mmengine - INFO - Epoch(train) [10][ 200/5758] lr: 5.8244e-04 eta: 7:29:07 time: 0.3890 data_time: 0.0016 memory: 20334 grad_norm: 0.0158 loss: 0.6849
2023/06/01 00:46:41 - mmengine - INFO - Epoch(train) [10][ 300/5758] lr: 5.8244e-04 eta: 7:28:19 time: 0.4332 data_time: 0.0018 memory: 20334 grad_norm: 0.0155 loss: 0.6848
2023/06/01 00:47:20 - mmengine - INFO - Epoch(train) [10][ 400/5758] lr: 5.8244e-04 eta: 7:27:32 time: 0.3700 data_time: 0.0018 memory: 20334 grad_norm: 0.0152 loss: 0.6876
2023/06/01 00:48:00 - mmengine - INFO - Epoch(train) [10][ 500/5758] lr: 5.8244e-04 eta: 7:26:47 time: 0.4105 data_time: 0.0016 memory: 20334 grad_norm: 0.0105 loss: 0.6874
2023/06/01 00:48:41 - mmengine - INFO - Epoch(train) [10][ 600/5758] lr: 5.8244e-04 eta: 7:26:01 time: 0.4195 data_time: 0.0016 memory: 20334 grad_norm: 0.0116 loss: 0.6903
2023/06/01 00:49:20 - mmengine - INFO - Epoch(train) [10][ 700/5758] lr: 5.8244e-04 eta: 7:25:15 time: 0.3901 data_time: 0.0018 memory: 20334 grad_norm: 0.0214 loss: 0.6843
2023/06/01 00:50:00 - mmengine - INFO - Epoch(train) [10][ 800/5758] lr: 5.8244e-04 eta: 7:24:29 time: 0.4323 data_time: 0.0014 memory: 20334 grad_norm: 0.0161 loss: 0.6893
2023/06/01 00:50:39 - mmengine - INFO - Epoch(train) [10][ 900/5758] lr: 5.8244e-04 eta: 7:23:42 time: 0.3841 data_time: 0.0017 memory: 20334 grad_norm: 0.0155 loss: 0.6848
2023/06/01 00:51:19 - mmengine - INFO - Epoch(train) [10][1000/5758] lr: 5.8244e-04 eta: 7:22:56 time: 0.3833 data_time: 0.0019 memory: 20334 grad_norm: 0.0183 loss: 0.6794
2023/06/01 00:51:59 - mmengine - INFO - Epoch(train) [10][1100/5758] lr: 5.8244e-04 eta: 7:22:10 time: 0.3641 data_time: 0.0018 memory: 20334 grad_norm: 0.0127 loss: 0.6832
2023/06/01 00:52:31 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:52:39 - mmengine - INFO - Epoch(train) [10][1200/5758] lr: 5.8244e-04 eta: 7:21:25 time: 0.3908 data_time: 0.0017 memory: 20334 grad_norm: 0.0186 loss: 0.6863
2023/06/01 00:53:20 - mmengine - INFO - Epoch(train) [10][1300/5758] lr: 5.8244e-04 eta: 7:20:40 time: 0.4295 data_time: 0.0015 memory: 20334 grad_norm: 0.0171 loss: 0.6921
2023/06/01 00:53:58 - mmengine - INFO - Epoch(train) [10][1400/5758] lr: 5.8244e-04 eta: 7:19:52 time: 0.3710 data_time: 0.0017 memory: 20334 grad_norm: 0.0203 loss: 0.6852
2023/06/01 00:54:38 - mmengine - INFO - Epoch(train) [10][1500/5758] lr: 5.8244e-04 eta: 7:19:06 time: 0.3865 data_time: 0.0018 memory: 20334 grad_norm: 0.0182 loss: 0.6863
2023/06/01 00:55:18 - mmengine - INFO - Epoch(train) [10][1600/5758] lr: 5.8244e-04 eta: 7:18:21 time: 0.4194 data_time: 0.0015 memory: 20334 grad_norm: 0.0177 loss: 0.6906
2023/06/01 00:55:58 - mmengine - INFO - Epoch(train) [10][1700/5758] lr: 5.8244e-04 eta: 7:17:35 time: 0.4043 data_time: 0.0015 memory: 20334 grad_norm: 0.0141 loss: 0.6836
2023/06/01 00:56:38 - mmengine - INFO - Epoch(train) [10][1800/5758] lr: 5.8244e-04 eta: 7:16:50 time: 0.3795 data_time: 0.0016 memory: 20334 grad_norm: 0.0189 loss: 0.6863
2023/06/01 00:57:18 - mmengine - INFO - Epoch(train) [10][1900/5758] lr: 5.8244e-04 eta: 7:16:04 time: 0.4205 data_time: 0.0014 memory: 20334 grad_norm: 0.0117 loss: 0.6885
2023/06/01 00:57:59 - mmengine - INFO - Epoch(train) [10][2000/5758] lr: 5.8244e-04 eta: 7:15:19 time: 0.4040 data_time: 0.0016 memory: 20334 grad_norm: 0.0186 loss: 0.6876
2023/06/01 00:58:38 - mmengine - INFO - Epoch(train) [10][2100/5758] lr: 5.8244e-04 eta: 7:14:33 time: 0.3708 data_time: 0.0015 memory: 20334 grad_norm: 0.0202 loss: 0.6900
2023/06/01 00:59:09 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 00:59:17 - mmengine - INFO - Epoch(train) [10][2200/5758] lr: 5.8244e-04 eta: 7:13:46 time: 0.4004 data_time: 0.0016 memory: 20334 grad_norm: 0.0131 loss: 0.6886
2023/06/01 00:59:56 - mmengine - INFO - Epoch(train) [10][2300/5758] lr: 5.8244e-04 eta: 7:13:00 time: 0.3760 data_time: 0.0019 memory: 20334 grad_norm: 0.0103 loss: 0.6827
2023/06/01 01:00:37 - mmengine - INFO - Epoch(train) [10][2400/5758] lr: 5.8244e-04 eta: 7:12:15 time: 0.4136 data_time: 0.0017 memory: 20334 grad_norm: 0.0192 loss: 0.6844
2023/06/01 01:01:17 - mmengine - INFO - Epoch(train) [10][2500/5758] lr: 5.8244e-04 eta: 7:11:30 time: 0.4294 data_time: 0.0020 memory: 20334 grad_norm: 0.0161 loss: 0.6878
2023/06/01 01:01:57 - mmengine - INFO - Epoch(train) [10][2600/5758] lr: 5.8244e-04 eta: 7:10:44 time: 0.4233 data_time: 0.0013 memory: 20334 grad_norm: 0.0128 loss: 0.6899
2023/06/01 01:02:36 - mmengine - INFO - Epoch(train) [10][2700/5758] lr: 5.8244e-04 eta: 7:09:58 time: 0.4120 data_time: 0.0015 memory: 20334 grad_norm: 0.0128 loss: 0.6877
2023/06/01 01:03:15 - mmengine - INFO - Epoch(train) [10][2800/5758] lr: 5.8244e-04 eta: 7:09:12 time: 0.4175 data_time: 0.0019 memory: 20334 grad_norm: 0.0157 loss: 0.6882
2023/06/01 01:03:55 - mmengine - INFO - Epoch(train) [10][2900/5758] lr: 5.8244e-04 eta: 7:08:26 time: 0.4090 data_time: 0.0015 memory: 20334 grad_norm: 0.0175 loss: 0.6817
2023/06/01 01:04:34 - mmengine - INFO - Epoch(train) [10][3000/5758] lr: 5.8244e-04 eta: 7:07:40 time: 0.3908 data_time: 0.0016 memory: 20334 grad_norm: 0.0069 loss: 0.6842
2023/06/01 01:05:13 - mmengine - INFO - Epoch(train) [10][3100/5758] lr: 5.8244e-04 eta: 7:06:54 time: 0.4035 data_time: 0.0017 memory: 20334 grad_norm: 0.0181 loss: 0.6884
2023/06/01 01:05:46 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:05:54 - mmengine - INFO - Epoch(train) [10][3200/5758] lr: 5.8244e-04 eta: 7:06:09 time: 0.3929 data_time: 0.0016 memory: 20334 grad_norm: 0.0088 loss: 0.6831
2023/06/01 01:06:34 - mmengine - INFO - Epoch(train) [10][3300/5758] lr: 5.8244e-04 eta: 7:05:23 time: 0.4198 data_time: 0.0016 memory: 20334 grad_norm: 0.0117 loss: 0.6844
2023/06/01 01:08:24 - mmengine - INFO - Epoch(train) [10][3400/5758] lr: 5.8244e-04 eta: 7:05:54 time: 0.4055 data_time: 0.0017 memory: 20334 grad_norm: 0.0115 loss: 0.6901
2023/06/01 01:09:02 - mmengine - INFO - Epoch(train) [10][3500/5758] lr: 5.8244e-04 eta: 7:05:07 time: 0.3676 data_time: 0.0017 memory: 20334 grad_norm: 0.0141 loss: 0.6913
2023/06/01 01:09:41 - mmengine - INFO - Epoch(train) [10][3600/5758] lr: 5.8244e-04 eta: 7:04:20 time: 0.3884 data_time: 0.0018 memory: 20334 grad_norm: 0.0159 loss: 0.6866
2023/06/01 01:10:21 - mmengine - INFO - Epoch(train) [10][3700/5758] lr: 5.8244e-04 eta: 7:03:35 time: 0.4087 data_time: 0.0018 memory: 20334 grad_norm: 0.0216 loss: 0.6877
2023/06/01 01:11:01 - mmengine - INFO - Epoch(train) [10][3800/5758] lr: 5.8244e-04 eta: 7:02:49 time: 0.4177 data_time: 0.0016 memory: 20334 grad_norm: 0.0146 loss: 0.6817
2023/06/01 01:11:41 - mmengine - INFO - Epoch(train) [10][3900/5758] lr: 5.8244e-04 eta: 7:02:04 time: 0.3804 data_time: 0.0019 memory: 20334 grad_norm: 0.0110 loss: 0.6878
2023/06/01 01:12:21 - mmengine - INFO - Epoch(train) [10][4000/5758] lr: 5.8244e-04 eta: 7:01:18 time: 0.3712 data_time: 0.0016 memory: 20334 grad_norm: 0.0101 loss: 0.6867
2023/06/01 01:13:02 - mmengine - INFO - Epoch(train) [10][4100/5758] lr: 5.8244e-04 eta: 7:00:34 time: 0.3736 data_time: 0.0019 memory: 20334 grad_norm: 0.0115 loss: 0.6877
2023/06/01 01:13:32 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:13:41 - mmengine - INFO - Epoch(train) [10][4200/5758] lr: 5.8244e-04 eta: 6:59:48 time: 0.3945 data_time: 0.0018 memory: 20334 grad_norm: 0.0124 loss: 0.6830
2023/06/01 01:14:21 - mmengine - INFO - Epoch(train) [10][4300/5758] lr: 5.8244e-04 eta: 6:59:02 time: 0.3841 data_time: 0.0020 memory: 20334 grad_norm: 0.0208 loss: 0.6918
2023/06/01 01:15:00 - mmengine - INFO - Epoch(train) [10][4400/5758] lr: 5.8244e-04 eta: 6:58:16 time: 0.4067 data_time: 0.0017 memory: 20334 grad_norm: 0.0135 loss: 0.6818
2023/06/01 01:15:39 - mmengine - INFO - Epoch(train) [10][4500/5758] lr: 5.8244e-04 eta: 6:57:30 time: 0.3725 data_time: 0.0019 memory: 20334 grad_norm: 0.0127 loss: 0.6808
2023/06/01 01:16:20 - mmengine - INFO - Epoch(train) [10][4600/5758] lr: 5.8244e-04 eta: 6:56:45 time: 0.3779 data_time: 0.0015 memory: 20334 grad_norm: 0.0210 loss: 0.6907
2023/06/01 01:17:01 - mmengine - INFO - Epoch(train) [10][4700/5758] lr: 5.8244e-04 eta: 6:56:02 time: 0.4157 data_time: 0.0017 memory: 20334 grad_norm: 0.0121 loss: 0.6793
2023/06/01 01:17:40 - mmengine - INFO - Epoch(train) [10][4800/5758] lr: 5.8244e-04 eta: 6:55:15 time: 0.3864 data_time: 0.0017 memory: 20334 grad_norm: 0.0156 loss: 0.6881
2023/06/01 01:18:20 - mmengine - INFO - Epoch(train) [10][4900/5758] lr: 5.8244e-04 eta: 6:54:30 time: 0.4160 data_time: 0.0017 memory: 20334 grad_norm: 0.0154 loss: 0.6878
2023/06/01 01:19:00 - mmengine - INFO - Epoch(train) [10][5000/5758] lr: 5.8244e-04 eta: 6:53:44 time: 0.4001 data_time: 0.0017 memory: 20334 grad_norm: 0.0165 loss: 0.6827
2023/06/01 01:19:40 - mmengine - INFO - Epoch(train) [10][5100/5758] lr: 5.8244e-04 eta: 6:52:59 time: 0.3830 data_time: 0.0016 memory: 20334 grad_norm: 0.0135 loss: 0.6873
2023/06/01 01:20:12 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:20:21 - mmengine - INFO - Epoch(train) [10][5200/5758] lr: 5.8244e-04 eta: 6:52:16 time: 0.4249 data_time: 0.0016 memory: 20334 grad_norm: 0.0196 loss: 0.6893
2023/06/01 01:21:01 - mmengine - INFO - Epoch(train) [10][5300/5758] lr: 5.8244e-04 eta: 6:51:31 time: 0.4225 data_time: 0.0016 memory: 20334 grad_norm: 0.0183 loss: 0.6847
2023/06/01 01:21:42 - mmengine - INFO - Epoch(train) [10][5400/5758] lr: 5.8244e-04 eta: 6:50:46 time: 0.3960 data_time: 0.0016 memory: 20334 grad_norm: 0.0171 loss: 0.6869
2023/06/01 01:22:22 - mmengine - INFO - Epoch(train) [10][5500/5758] lr: 5.8244e-04 eta: 6:50:01 time: 0.4215 data_time: 0.0019 memory: 20334 grad_norm: 0.0119 loss: 0.6865
2023/06/01 01:23:01 - mmengine - INFO - Epoch(train) [10][5600/5758] lr: 5.8244e-04 eta: 6:49:15 time: 0.3706 data_time: 0.0018 memory: 20334 grad_norm: 0.0249 loss: 0.6873
2023/06/01 01:23:41 - mmengine - INFO - Epoch(train) [10][5700/5758] lr: 5.8244e-04 eta: 6:48:30 time: 0.4217 data_time: 0.0019 memory: 20334 grad_norm: 0.0146 loss: 0.6857
2023/06/01 01:24:04 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:24:04 - mmengine - INFO - Saving checkpoint at 10 epochs
2023/06/01 01:24:21 - mmengine - INFO - Epoch(val) [10][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3693 time: 1.0086
2023/06/01 01:25:05 - mmengine - INFO - Epoch(train) [11][ 100/5758] lr: 5.0500e-04 eta: 6:47:23 time: 0.4090 data_time: 0.0017 memory: 20334 grad_norm: 0.0156 loss: 0.6873
2023/06/01 01:25:45 - mmengine - INFO - Epoch(train) [11][ 200/5758] lr: 5.0500e-04 eta: 6:46:38 time: 0.3714 data_time: 0.0017 memory: 20334 grad_norm: 0.0134 loss: 0.6865
2023/06/01 01:26:25 - mmengine - INFO - Epoch(train) [11][ 300/5758] lr: 5.0500e-04 eta: 6:45:53 time: 0.3854 data_time: 0.0017 memory: 20334 grad_norm: 0.0179 loss: 0.6903
2023/06/01 01:27:06 - mmengine - INFO - Epoch(train) [11][ 400/5758] lr: 5.0500e-04 eta: 6:45:09 time: 0.4526 data_time: 0.0016 memory: 20334 grad_norm: 0.0126 loss: 0.6907
2023/06/01 01:27:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:27:46 - mmengine - INFO - Epoch(train) [11][ 500/5758] lr: 5.0500e-04 eta: 6:44:24 time: 0.4103 data_time: 0.0018 memory: 20334 grad_norm: 0.0132 loss: 0.6839
2023/06/01 01:28:25 - mmengine - INFO - Epoch(train) [11][ 600/5758] lr: 5.0500e-04 eta: 6:43:38 time: 0.3825 data_time: 0.0016 memory: 20334 grad_norm: 0.0110 loss: 0.6890
2023/06/01 01:29:05 - mmengine - INFO - Epoch(train) [11][ 700/5758] lr: 5.0500e-04 eta: 6:42:53 time: 0.4215 data_time: 0.0017 memory: 20334 grad_norm: 0.0168 loss: 0.6823
2023/06/01 01:29:45 - mmengine - INFO - Epoch(train) [11][ 800/5758] lr: 5.0500e-04 eta: 6:42:08 time: 0.3628 data_time: 0.0017 memory: 20334 grad_norm: 0.0158 loss: 0.6853
2023/06/01 01:30:24 - mmengine - INFO - Epoch(train) [11][ 900/5758] lr: 5.0500e-04 eta: 6:41:23 time: 0.3953 data_time: 0.0017 memory: 20334 grad_norm: 0.0144 loss: 0.6872
2023/06/01 01:31:04 - mmengine - INFO - Epoch(train) [11][1000/5758] lr: 5.0500e-04 eta: 6:40:37 time: 0.3968 data_time: 0.0016 memory: 20334 grad_norm: 0.0158 loss: 0.6875
2023/06/01 01:31:43 - mmengine - INFO - Epoch(train) [11][1100/5758] lr: 5.0500e-04 eta: 6:39:51 time: 0.3885 data_time: 0.0017 memory: 20334 grad_norm: 0.0135 loss: 0.6835
2023/06/01 01:32:23 - mmengine - INFO - Epoch(train) [11][1200/5758] lr: 5.0500e-04 eta: 6:39:07 time: 0.4766 data_time: 0.0016 memory: 20334 grad_norm: 0.0178 loss: 0.6883
2023/06/01 01:33:04 - mmengine - INFO - Epoch(train) [11][1300/5758] lr: 5.0500e-04 eta: 6:38:22 time: 0.3716 data_time: 0.0018 memory: 20334 grad_norm: 0.0212 loss: 0.6876
2023/06/01 01:33:45 - mmengine - INFO - Epoch(train) [11][1400/5758] lr: 5.0500e-04 eta: 6:37:39 time: 0.4112 data_time: 0.0020 memory: 20334 grad_norm: 0.0115 loss: 0.6881
2023/06/01 01:33:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:34:26 - mmengine - INFO - Epoch(train) [11][1500/5758] lr: 5.0500e-04 eta: 6:36:55 time: 0.4026 data_time: 0.0020 memory: 20334 grad_norm: 0.0114 loss: 0.6862
2023/06/01 01:35:05 - mmengine - INFO - Epoch(train) [11][1600/5758] lr: 5.0500e-04 eta: 6:36:09 time: 0.4091 data_time: 0.0018 memory: 20334 grad_norm: 0.0144 loss: 0.6901
2023/06/01 01:35:45 - mmengine - INFO - Epoch(train) [11][1700/5758] lr: 5.0500e-04 eta: 6:35:25 time: 0.4363 data_time: 0.0016 memory: 20334 grad_norm: 0.0201 loss: 0.6836
2023/06/01 01:36:26 - mmengine - INFO - Epoch(train) [11][1800/5758] lr: 5.0500e-04 eta: 6:34:41 time: 0.3936 data_time: 0.0017 memory: 20334 grad_norm: 0.0148 loss: 0.6905
2023/06/01 01:37:07 - mmengine - INFO - Epoch(train) [11][1900/5758] lr: 5.0500e-04 eta: 6:33:56 time: 0.4215 data_time: 0.0016 memory: 20334 grad_norm: 0.0212 loss: 0.6817
2023/06/01 01:37:46 - mmengine - INFO - Epoch(train) [11][2000/5758] lr: 5.0500e-04 eta: 6:33:11 time: 0.3910 data_time: 0.0015 memory: 20334 grad_norm: 0.0194 loss: 0.6845
2023/06/01 01:38:26 - mmengine - INFO - Epoch(train) [11][2100/5758] lr: 5.0500e-04 eta: 6:32:27 time: 0.4174 data_time: 0.0015 memory: 20334 grad_norm: 0.0126 loss: 0.6814
2023/06/01 01:39:06 - mmengine - INFO - Epoch(train) [11][2200/5758] lr: 5.0500e-04 eta: 6:31:42 time: 0.3951 data_time: 0.0015 memory: 20334 grad_norm: 0.0177 loss: 0.6864
2023/06/01 01:39:46 - mmengine - INFO - Epoch(train) [11][2300/5758] lr: 5.0500e-04 eta: 6:30:57 time: 0.4045 data_time: 0.0015 memory: 20334 grad_norm: 0.0183 loss: 0.6857
2023/06/01 01:40:27 - mmengine - INFO - Epoch(train) [11][2400/5758] lr: 5.0500e-04 eta: 6:30:13 time: 0.4253 data_time: 0.0013 memory: 20334 grad_norm: 0.0133 loss: 0.6857
2023/06/01 01:40:36 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:41:08 - mmengine - INFO - Epoch(train) [11][2500/5758] lr: 5.0500e-04 eta: 6:29:29 time: 0.3719 data_time: 0.0014 memory: 20334 grad_norm: 0.0134 loss: 0.6874
2023/06/01 01:41:48 - mmengine - INFO - Epoch(train) [11][2600/5758] lr: 5.0500e-04 eta: 6:28:45 time: 0.4071 data_time: 0.0013 memory: 20334 grad_norm: 0.0163 loss: 0.6870
2023/06/01 01:42:29 - mmengine - INFO - Epoch(train) [11][2700/5758] lr: 5.0500e-04 eta: 6:28:01 time: 0.4196 data_time: 0.0014 memory: 20334 grad_norm: 0.0130 loss: 0.6926
2023/06/01 01:43:09 - mmengine - INFO - Epoch(train) [11][2800/5758] lr: 5.0500e-04 eta: 6:27:16 time: 0.4231 data_time: 0.0015 memory: 20334 grad_norm: 0.0163 loss: 0.6841
2023/06/01 01:43:48 - mmengine - INFO - Epoch(train) [11][2900/5758] lr: 5.0500e-04 eta: 6:26:31 time: 0.4027 data_time: 0.0013 memory: 20334 grad_norm: 0.0160 loss: 0.6814
2023/06/01 01:44:28 - mmengine - INFO - Epoch(train) [11][3000/5758] lr: 5.0500e-04 eta: 6:25:47 time: 0.3883 data_time: 0.0014 memory: 20334 grad_norm: 0.0186 loss: 0.6886
2023/06/01 01:45:09 - mmengine - INFO - Epoch(train) [11][3100/5758] lr: 5.0500e-04 eta: 6:25:02 time: 0.4226 data_time: 0.0019 memory: 20334 grad_norm: 0.0186 loss: 0.6887
2023/06/01 01:45:49 - mmengine - INFO - Epoch(train) [11][3200/5758] lr: 5.0500e-04 eta: 6:24:18 time: 0.3969 data_time: 0.0016 memory: 20334 grad_norm: 0.0239 loss: 0.6888
2023/06/01 01:46:29 - mmengine - INFO - Epoch(train) [11][3300/5758] lr: 5.0500e-04 eta: 6:23:33 time: 0.3683 data_time: 0.0017 memory: 20334 grad_norm: 0.0195 loss: 0.6818
2023/06/01 01:47:10 - mmengine - INFO - Epoch(train) [11][3400/5758] lr: 5.0500e-04 eta: 6:22:49 time: 0.4013 data_time: 0.0015 memory: 20334 grad_norm: 0.0218 loss: 0.6876
2023/06/01 01:47:18 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:47:49 - mmengine - INFO - Epoch(train) [11][3500/5758] lr: 5.0500e-04 eta: 6:22:05 time: 0.3984 data_time: 0.0017 memory: 20334 grad_norm: 0.0136 loss: 0.6868
2023/06/01 01:48:32 - mmengine - INFO - Epoch(train) [11][3600/5758] lr: 5.0500e-04 eta: 6:21:22 time: 0.4510 data_time: 0.0019 memory: 20334 grad_norm: 0.0144 loss: 0.6916
2023/06/01 01:49:13 - mmengine - INFO - Epoch(train) [11][3700/5758] lr: 5.0500e-04 eta: 6:20:39 time: 0.4095 data_time: 0.0019 memory: 20334 grad_norm: 0.0127 loss: 0.6799
2023/06/01 01:49:53 - mmengine - INFO - Epoch(train) [11][3800/5758] lr: 5.0500e-04 eta: 6:19:55 time: 0.4010 data_time: 0.0017 memory: 20334 grad_norm: 0.0263 loss: 0.6835
2023/06/01 01:50:34 - mmengine - INFO - Epoch(train) [11][3900/5758] lr: 5.0500e-04 eta: 6:19:11 time: 0.4133 data_time: 0.0018 memory: 20334 grad_norm: 0.0257 loss: 0.6871
2023/06/01 01:51:14 - mmengine - INFO - Epoch(train) [11][4000/5758] lr: 5.0500e-04 eta: 6:18:27 time: 0.3668 data_time: 0.0019 memory: 20334 grad_norm: 0.0124 loss: 0.6869
2023/06/01 01:51:56 - mmengine - INFO - Epoch(train) [11][4100/5758] lr: 5.0500e-04 eta: 6:17:44 time: 0.4262 data_time: 0.0015 memory: 20334 grad_norm: 0.0131 loss: 0.6843
2023/06/01 01:52:36 - mmengine - INFO - Epoch(train) [11][4200/5758] lr: 5.0500e-04 eta: 6:16:59 time: 0.4180 data_time: 0.0018 memory: 20334 grad_norm: 0.0095 loss: 0.6876
2023/06/01 01:53:17 - mmengine - INFO - Epoch(train) [11][4300/5758] lr: 5.0500e-04 eta: 6:16:16 time: 0.4135 data_time: 0.0015 memory: 20334 grad_norm: 0.0148 loss: 0.6903
2023/06/01 01:53:58 - mmengine - INFO - Epoch(train) [11][4400/5758] lr: 5.0500e-04 eta: 6:15:32 time: 0.4193 data_time: 0.0018 memory: 20334 grad_norm: 0.0247 loss: 0.6850
2023/06/01 01:54:06 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 01:54:39 - mmengine - INFO - Epoch(train) [11][4500/5758] lr: 5.0500e-04 eta: 6:14:48 time: 0.3970 data_time: 0.0018 memory: 20334 grad_norm: 0.0213 loss: 0.6857
2023/06/01 01:55:19 - mmengine - INFO - Epoch(train) [11][4600/5758] lr: 5.0500e-04 eta: 6:14:04 time: 0.4347 data_time: 0.0017 memory: 20334 grad_norm: 0.0156 loss: 0.6863
2023/06/01 01:55:58 - mmengine - INFO - Epoch(train) [11][4700/5758] lr: 5.0500e-04 eta: 6:13:19 time: 0.3953 data_time: 0.0018 memory: 20334 grad_norm: 0.0190 loss: 0.6859
2023/06/01 01:56:39 - mmengine - INFO - Epoch(train) [11][4800/5758] lr: 5.0500e-04 eta: 6:12:35 time: 0.4231 data_time: 0.0019 memory: 20334 grad_norm: 0.0218 loss: 0.6859
2023/06/01 01:57:20 - mmengine - INFO - Epoch(train) [11][4900/5758] lr: 5.0500e-04 eta: 6:11:52 time: 0.3895 data_time: 0.0018 memory: 20334 grad_norm: 0.0144 loss: 0.6904
2023/06/01 01:58:00 - mmengine - INFO - Epoch(train) [11][5000/5758] lr: 5.0500e-04 eta: 6:11:07 time: 0.3818 data_time: 0.0027 memory: 20334 grad_norm: 0.0133 loss: 0.6810
2023/06/01 01:58:40 - mmengine - INFO - Epoch(train) [11][5100/5758] lr: 5.0500e-04 eta: 6:10:24 time: 0.3831 data_time: 0.0017 memory: 20334 grad_norm: 0.0147 loss: 0.6874
2023/06/01 01:59:22 - mmengine - INFO - Epoch(train) [11][5200/5758] lr: 5.0500e-04 eta: 6:09:41 time: 0.3923 data_time: 0.0023 memory: 20334 grad_norm: 0.0130 loss: 0.6864
2023/06/01 02:00:03 - mmengine - INFO - Epoch(train) [11][5300/5758] lr: 5.0500e-04 eta: 6:08:57 time: 0.4071 data_time: 0.0016 memory: 20334 grad_norm: 0.0164 loss: 0.6879
2023/06/01 02:00:43 - mmengine - INFO - Epoch(train) [11][5400/5758] lr: 5.0500e-04 eta: 6:08:13 time: 0.3822 data_time: 0.0017 memory: 20334 grad_norm: 0.0173 loss: 0.6896
2023/06/01 02:00:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 02:01:23 - mmengine - INFO - Epoch(train) [11][5500/5758] lr: 5.0500e-04 eta: 6:07:29 time: 0.4003
data_time: 0.0018 memory: 20334 grad_norm: 0.0177 loss: 0.6919 2023/06/01 02:02:04 - mmengine - INFO - Epoch(train) [11][5600/5758] lr: 5.0500e-04 eta: 6:06:45 time: 0.3750 data_time: 0.0019 memory: 20334 grad_norm: 0.0145 loss: 0.6873 2023/06/01 02:02:45 - mmengine - INFO - Epoch(train) [11][5700/5758] lr: 5.0500e-04 eta: 6:06:01 time: 0.4209 data_time: 0.0016 memory: 20334 grad_norm: 0.0182 loss: 0.6873 2023/06/01 02:03:08 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:03:08 - mmengine - INFO - Saving checkpoint at 11 epochs 2023/06/01 02:03:25 - mmengine - INFO - Epoch(val) [11][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3622 time: 0.9878 2023/06/01 02:04:11 - mmengine - INFO - Epoch(train) [12][ 100/5758] lr: 4.2756e-04 eta: 6:04:56 time: 0.4174 data_time: 0.0016 memory: 20334 grad_norm: 0.0167 loss: 0.6899 2023/06/01 02:04:52 - mmengine - INFO - Epoch(train) [12][ 200/5758] lr: 4.2756e-04 eta: 6:04:13 time: 0.3964 data_time: 0.0016 memory: 20334 grad_norm: 0.0208 loss: 0.6836 2023/06/01 02:05:32 - mmengine - INFO - Epoch(train) [12][ 300/5758] lr: 4.2756e-04 eta: 6:03:28 time: 0.3718 data_time: 0.0018 memory: 20334 grad_norm: 0.0154 loss: 0.6921 2023/06/01 02:06:12 - mmengine - INFO - Epoch(train) [12][ 400/5758] lr: 4.2756e-04 eta: 6:02:44 time: 0.3979 data_time: 0.0017 memory: 20334 grad_norm: 0.0131 loss: 0.6874 2023/06/01 02:06:53 - mmengine - INFO - Epoch(train) [12][ 500/5758] lr: 4.2756e-04 eta: 6:02:01 time: 0.3903 data_time: 0.0015 memory: 20334 grad_norm: 0.0158 loss: 0.6895 2023/06/01 02:07:34 - mmengine - INFO - Epoch(train) [12][ 600/5758] lr: 4.2756e-04 eta: 6:01:18 time: 0.3752 data_time: 0.0017 memory: 20334 grad_norm: 0.0149 loss: 0.6878 2023/06/01 02:08:00 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:08:15 - mmengine - INFO - 
Epoch(train) [12][ 700/5758] lr: 4.2756e-04 eta: 6:00:34 time: 0.4368 data_time: 0.0018 memory: 20334 grad_norm: 0.0146 loss: 0.6890 2023/06/01 02:08:56 - mmengine - INFO - Epoch(train) [12][ 800/5758] lr: 4.2756e-04 eta: 5:59:51 time: 0.4049 data_time: 0.0017 memory: 20334 grad_norm: 0.0140 loss: 0.6842 2023/06/01 02:09:37 - mmengine - INFO - Epoch(train) [12][ 900/5758] lr: 4.2756e-04 eta: 5:59:07 time: 0.3810 data_time: 0.0029 memory: 20334 grad_norm: 0.0170 loss: 0.6824 2023/06/01 02:10:17 - mmengine - INFO - Epoch(train) [12][1000/5758] lr: 4.2756e-04 eta: 5:58:23 time: 0.3715 data_time: 0.0018 memory: 20334 grad_norm: 0.0112 loss: 0.6865 2023/06/01 02:10:58 - mmengine - INFO - Epoch(train) [12][1100/5758] lr: 4.2756e-04 eta: 5:57:40 time: 0.4021 data_time: 0.0017 memory: 20334 grad_norm: 0.0156 loss: 0.6854 2023/06/01 02:11:41 - mmengine - INFO - Epoch(train) [12][1200/5758] lr: 4.2756e-04 eta: 5:56:58 time: 0.4122 data_time: 0.0023 memory: 20334 grad_norm: 0.0186 loss: 0.6856 2023/06/01 02:12:23 - mmengine - INFO - Epoch(train) [12][1300/5758] lr: 4.2756e-04 eta: 5:56:15 time: 0.4182 data_time: 0.0014 memory: 20334 grad_norm: 0.0197 loss: 0.6861 2023/06/01 02:13:04 - mmengine - INFO - Epoch(train) [12][1400/5758] lr: 4.2756e-04 eta: 5:55:32 time: 0.3827 data_time: 0.0015 memory: 20334 grad_norm: 0.0152 loss: 0.6856 2023/06/01 02:13:45 - mmengine - INFO - Epoch(train) [12][1500/5758] lr: 4.2756e-04 eta: 5:54:49 time: 0.4138 data_time: 0.0015 memory: 20334 grad_norm: 0.0193 loss: 0.6876 2023/06/01 02:14:25 - mmengine - INFO - Epoch(train) [12][1600/5758] lr: 4.2756e-04 eta: 5:54:05 time: 0.3940 data_time: 0.0020 memory: 20334 grad_norm: 0.0134 loss: 0.6887 2023/06/01 02:14:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:15:07 - mmengine - INFO - Epoch(train) [12][1700/5758] lr: 4.2756e-04 eta: 5:53:22 time: 0.4305 data_time: 0.0019 memory: 20334 grad_norm: 0.0235 loss: 0.6858 2023/06/01 02:15:47 - mmengine - INFO - 
Epoch(train) [12][1800/5758] lr: 4.2756e-04 eta: 5:52:38 time: 0.4081 data_time: 0.0018 memory: 20334 grad_norm: 0.0164 loss: 0.6866 2023/06/01 02:16:28 - mmengine - INFO - Epoch(train) [12][1900/5758] lr: 4.2756e-04 eta: 5:51:55 time: 0.4181 data_time: 0.0020 memory: 20334 grad_norm: 0.0173 loss: 0.6868 2023/06/01 02:17:10 - mmengine - INFO - Epoch(train) [12][2000/5758] lr: 4.2756e-04 eta: 5:51:12 time: 0.4066 data_time: 0.0016 memory: 20334 grad_norm: 0.0169 loss: 0.6861 2023/06/01 02:17:51 - mmengine - INFO - Epoch(train) [12][2100/5758] lr: 4.2756e-04 eta: 5:50:29 time: 0.3917 data_time: 0.0016 memory: 20334 grad_norm: 0.0211 loss: 0.6913 2023/06/01 02:18:32 - mmengine - INFO - Epoch(train) [12][2200/5758] lr: 4.2756e-04 eta: 5:49:45 time: 0.4172 data_time: 0.0017 memory: 20334 grad_norm: 0.0162 loss: 0.6850 2023/06/01 02:19:12 - mmengine - INFO - Epoch(train) [12][2300/5758] lr: 4.2756e-04 eta: 5:49:02 time: 0.3820 data_time: 0.0019 memory: 20334 grad_norm: 0.0124 loss: 0.6903 2023/06/01 02:19:53 - mmengine - INFO - Epoch(train) [12][2400/5758] lr: 4.2756e-04 eta: 5:48:19 time: 0.3993 data_time: 0.0019 memory: 20334 grad_norm: 0.0221 loss: 0.6847 2023/06/01 02:20:34 - mmengine - INFO - Epoch(train) [12][2500/5758] lr: 4.2756e-04 eta: 5:47:35 time: 0.4088 data_time: 0.0020 memory: 20334 grad_norm: 0.0126 loss: 0.6928 2023/06/01 02:21:14 - mmengine - INFO - Epoch(train) [12][2600/5758] lr: 4.2756e-04 eta: 5:46:51 time: 0.3915 data_time: 0.0018 memory: 20334 grad_norm: 0.0179 loss: 0.6878 2023/06/01 02:21:39 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:21:55 - mmengine - INFO - Epoch(train) [12][2700/5758] lr: 4.2756e-04 eta: 5:46:08 time: 0.3912 data_time: 0.0021 memory: 20334 grad_norm: 0.0172 loss: 0.6884 2023/06/01 02:22:37 - mmengine - INFO - Epoch(train) [12][2800/5758] lr: 4.2756e-04 eta: 5:45:25 time: 0.4016 data_time: 0.0018 memory: 20334 grad_norm: 0.0187 loss: 0.6887 2023/06/01 02:23:18 - mmengine - INFO - 
Epoch(train) [12][2900/5758] lr: 4.2756e-04 eta: 5:44:42 time: 0.3667 data_time: 0.0017 memory: 20334 grad_norm: 0.0153 loss: 0.6836 2023/06/01 02:23:59 - mmengine - INFO - Epoch(train) [12][3000/5758] lr: 4.2756e-04 eta: 5:43:58 time: 0.4098 data_time: 0.0017 memory: 20334 grad_norm: 0.0124 loss: 0.6891 2023/06/01 02:24:38 - mmengine - INFO - Epoch(train) [12][3100/5758] lr: 4.2756e-04 eta: 5:43:14 time: 0.4134 data_time: 0.0026 memory: 20334 grad_norm: 0.0183 loss: 0.6834 2023/06/01 02:25:19 - mmengine - INFO - Epoch(train) [12][3200/5758] lr: 4.2756e-04 eta: 5:42:31 time: 0.4088 data_time: 0.0018 memory: 20334 grad_norm: 0.0214 loss: 0.6866 2023/06/01 02:26:00 - mmengine - INFO - Epoch(train) [12][3300/5758] lr: 4.2756e-04 eta: 5:41:48 time: 0.4331 data_time: 0.0018 memory: 20334 grad_norm: 0.0172 loss: 0.6874 2023/06/01 02:26:41 - mmengine - INFO - Epoch(train) [12][3400/5758] lr: 4.2756e-04 eta: 5:41:05 time: 0.4540 data_time: 0.0016 memory: 20334 grad_norm: 0.0108 loss: 0.6891 2023/06/01 02:27:23 - mmengine - INFO - Epoch(train) [12][3500/5758] lr: 4.2756e-04 eta: 5:40:22 time: 0.4149 data_time: 0.0018 memory: 20334 grad_norm: 0.0150 loss: 0.6788 2023/06/01 02:28:03 - mmengine - INFO - Epoch(train) [12][3600/5758] lr: 4.2756e-04 eta: 5:39:38 time: 0.4236 data_time: 0.0017 memory: 20334 grad_norm: 0.0179 loss: 0.6834 2023/06/01 02:28:30 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:28:46 - mmengine - INFO - Epoch(train) [12][3700/5758] lr: 4.2756e-04 eta: 5:38:56 time: 0.4062 data_time: 0.0017 memory: 20334 grad_norm: 0.0107 loss: 0.6892 2023/06/01 02:29:26 - mmengine - INFO - Epoch(train) [12][3800/5758] lr: 4.2756e-04 eta: 5:38:13 time: 0.4004 data_time: 0.0018 memory: 20334 grad_norm: 0.0169 loss: 0.6833 2023/06/01 02:30:07 - mmengine - INFO - Epoch(train) [12][3900/5758] lr: 4.2756e-04 eta: 5:37:29 time: 0.4242 data_time: 0.0019 memory: 20334 grad_norm: 0.0192 loss: 0.6863 2023/06/01 02:30:48 - mmengine - INFO - 
Epoch(train) [12][4000/5758] lr: 4.2756e-04 eta: 5:36:46 time: 0.4238 data_time: 0.0017 memory: 20334 grad_norm: 0.0180 loss: 0.6898 2023/06/01 02:31:30 - mmengine - INFO - Epoch(train) [12][4100/5758] lr: 4.2756e-04 eta: 5:36:04 time: 0.3922 data_time: 0.0018 memory: 20334 grad_norm: 0.0174 loss: 0.6843 2023/06/01 02:32:11 - mmengine - INFO - Epoch(train) [12][4200/5758] lr: 4.2756e-04 eta: 5:35:21 time: 0.4629 data_time: 0.0018 memory: 20334 grad_norm: 0.0163 loss: 0.6855 2023/06/01 02:32:52 - mmengine - INFO - Epoch(train) [12][4300/5758] lr: 4.2756e-04 eta: 5:34:38 time: 0.4016 data_time: 0.0020 memory: 20334 grad_norm: 0.0091 loss: 0.6874 2023/06/01 02:33:34 - mmengine - INFO - Epoch(train) [12][4400/5758] lr: 4.2756e-04 eta: 5:33:55 time: 0.3769 data_time: 0.0018 memory: 20334 grad_norm: 0.0186 loss: 0.6868 2023/06/01 02:34:14 - mmengine - INFO - Epoch(train) [12][4500/5758] lr: 4.2756e-04 eta: 5:33:11 time: 0.4596 data_time: 0.0017 memory: 20334 grad_norm: 0.0145 loss: 0.6851 2023/06/01 02:34:54 - mmengine - INFO - Epoch(train) [12][4600/5758] lr: 4.2756e-04 eta: 5:32:27 time: 0.3866 data_time: 0.0019 memory: 20334 grad_norm: 0.0203 loss: 0.6899 2023/06/01 02:35:19 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:35:34 - mmengine - INFO - Epoch(train) [12][4700/5758] lr: 4.2756e-04 eta: 5:31:44 time: 0.3940 data_time: 0.0017 memory: 20334 grad_norm: 0.0242 loss: 0.6864 2023/06/01 02:36:15 - mmengine - INFO - Epoch(train) [12][4800/5758] lr: 4.2756e-04 eta: 5:31:01 time: 0.4072 data_time: 0.0018 memory: 20334 grad_norm: 0.0172 loss: 0.6876 2023/06/01 02:36:57 - mmengine - INFO - Epoch(train) [12][4900/5758] lr: 4.2756e-04 eta: 5:30:18 time: 0.4369 data_time: 0.0021 memory: 20334 grad_norm: 0.0201 loss: 0.6837 2023/06/01 02:37:37 - mmengine - INFO - Epoch(train) [12][5000/5758] lr: 4.2756e-04 eta: 5:29:35 time: 0.4183 data_time: 0.0017 memory: 20334 grad_norm: 0.0112 loss: 0.6864 2023/06/01 02:38:19 - mmengine - INFO - 
Epoch(train) [12][5100/5758] lr: 4.2756e-04 eta: 5:28:52 time: 0.4049 data_time: 0.0019 memory: 20334 grad_norm: 0.0135 loss: 0.6894 2023/06/01 02:38:59 - mmengine - INFO - Epoch(train) [12][5200/5758] lr: 4.2756e-04 eta: 5:28:09 time: 0.3898 data_time: 0.0018 memory: 20334 grad_norm: 0.0130 loss: 0.6878 2023/06/01 02:39:39 - mmengine - INFO - Epoch(train) [12][5300/5758] lr: 4.2756e-04 eta: 5:27:25 time: 0.3955 data_time: 0.0018 memory: 20334 grad_norm: 0.0145 loss: 0.6886 2023/06/01 02:40:20 - mmengine - INFO - Epoch(train) [12][5400/5758] lr: 4.2756e-04 eta: 5:26:41 time: 0.3866 data_time: 0.0016 memory: 20334 grad_norm: 0.0206 loss: 0.6910 2023/06/01 02:41:00 - mmengine - INFO - Epoch(train) [12][5500/5758] lr: 4.2756e-04 eta: 5:25:58 time: 0.4252 data_time: 0.0020 memory: 20334 grad_norm: 0.0150 loss: 0.6893 2023/06/01 02:41:41 - mmengine - INFO - Epoch(train) [12][5600/5758] lr: 4.2756e-04 eta: 5:25:15 time: 0.4106 data_time: 0.0020 memory: 20334 grad_norm: 0.0154 loss: 0.6877 2023/06/01 02:42:07 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:42:21 - mmengine - INFO - Epoch(train) [12][5700/5758] lr: 4.2756e-04 eta: 5:24:31 time: 0.3690 data_time: 0.0016 memory: 20334 grad_norm: 0.0150 loss: 0.6824 2023/06/01 02:42:45 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:42:45 - mmengine - INFO - Saving checkpoint at 12 epochs 2023/06/01 02:43:03 - mmengine - INFO - Epoch(val) [12][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3588 time: 0.9851 2023/06/01 02:43:49 - mmengine - INFO - Epoch(train) [13][ 100/5758] lr: 3.5204e-04 eta: 5:23:26 time: 0.4269 data_time: 0.0025 memory: 20334 grad_norm: 0.0141 loss: 0.6865 2023/06/01 02:44:30 - mmengine - INFO - Epoch(train) [13][ 200/5758] lr: 3.5204e-04 eta: 5:22:43 time: 0.4002 data_time: 0.0017 memory: 20334 
grad_norm: 0.0101 loss: 0.6878 2023/06/01 02:45:10 - mmengine - INFO - Epoch(train) [13][ 300/5758] lr: 3.5204e-04 eta: 5:21:59 time: 0.4012 data_time: 0.0014 memory: 20334 grad_norm: 0.0100 loss: 0.6862 2023/06/01 02:45:49 - mmengine - INFO - Epoch(train) [13][ 400/5758] lr: 3.5204e-04 eta: 5:21:15 time: 0.3791 data_time: 0.0024 memory: 20334 grad_norm: 0.0128 loss: 0.6865 2023/06/01 02:46:30 - mmengine - INFO - Epoch(train) [13][ 500/5758] lr: 3.5204e-04 eta: 5:20:32 time: 0.4206 data_time: 0.0020 memory: 20334 grad_norm: 0.0171 loss: 0.6865 2023/06/01 02:47:11 - mmengine - INFO - Epoch(train) [13][ 600/5758] lr: 3.5204e-04 eta: 5:19:49 time: 0.4063 data_time: 0.0023 memory: 20334 grad_norm: 0.0095 loss: 0.6879 2023/06/01 02:47:51 - mmengine - INFO - Epoch(train) [13][ 700/5758] lr: 3.5204e-04 eta: 5:19:05 time: 0.4161 data_time: 0.0020 memory: 20334 grad_norm: 0.0248 loss: 0.6804 2023/06/01 02:48:32 - mmengine - INFO - Epoch(train) [13][ 800/5758] lr: 3.5204e-04 eta: 5:18:22 time: 0.3798 data_time: 0.0020 memory: 20334 grad_norm: 0.0139 loss: 0.6857 2023/06/01 02:49:14 - mmengine - INFO - Epoch(train) [13][ 900/5758] lr: 3.5204e-04 eta: 5:17:40 time: 0.4143 data_time: 0.0019 memory: 20334 grad_norm: 0.0180 loss: 0.6901 2023/06/01 02:49:15 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:49:53 - mmengine - INFO - Epoch(train) [13][1000/5758] lr: 3.5204e-04 eta: 5:16:56 time: 0.3758 data_time: 0.0020 memory: 20334 grad_norm: 0.0149 loss: 0.6854 2023/06/01 02:50:34 - mmengine - INFO - Epoch(train) [13][1100/5758] lr: 3.5204e-04 eta: 5:16:13 time: 0.4092 data_time: 0.0018 memory: 20334 grad_norm: 0.0154 loss: 0.6901 2023/06/01 02:51:15 - mmengine - INFO - Epoch(train) [13][1200/5758] lr: 3.5204e-04 eta: 5:15:30 time: 0.4044 data_time: 0.0018 memory: 20334 grad_norm: 0.0099 loss: 0.6862 2023/06/01 02:51:56 - mmengine - INFO - Epoch(train) [13][1300/5758] lr: 3.5204e-04 eta: 5:14:47 time: 0.3941 data_time: 0.0017 memory: 20334 
grad_norm: 0.0154 loss: 0.6870 2023/06/01 02:52:37 - mmengine - INFO - Epoch(train) [13][1400/5758] lr: 3.5204e-04 eta: 5:14:04 time: 0.3964 data_time: 0.0024 memory: 20334 grad_norm: 0.0230 loss: 0.6842 2023/06/01 02:53:17 - mmengine - INFO - Epoch(train) [13][1500/5758] lr: 3.5204e-04 eta: 5:13:21 time: 0.3983 data_time: 0.0016 memory: 20334 grad_norm: 0.0152 loss: 0.6912 2023/06/01 02:53:58 - mmengine - INFO - Epoch(train) [13][1600/5758] lr: 3.5204e-04 eta: 5:12:38 time: 0.3649 data_time: 0.0017 memory: 20334 grad_norm: 0.0111 loss: 0.6871 2023/06/01 02:54:40 - mmengine - INFO - Epoch(train) [13][1700/5758] lr: 3.5204e-04 eta: 5:11:55 time: 0.3884 data_time: 0.0018 memory: 20334 grad_norm: 0.0183 loss: 0.6873 2023/06/01 02:55:21 - mmengine - INFO - Epoch(train) [13][1800/5758] lr: 3.5204e-04 eta: 5:11:12 time: 0.4034 data_time: 0.0017 memory: 20334 grad_norm: 0.0086 loss: 0.6863 2023/06/01 02:56:03 - mmengine - INFO - Epoch(train) [13][1900/5758] lr: 3.5204e-04 eta: 5:10:30 time: 0.3939 data_time: 0.0017 memory: 20334 grad_norm: 0.0165 loss: 0.6863 2023/06/01 02:56:04 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 02:56:44 - mmengine - INFO - Epoch(train) [13][2000/5758] lr: 3.5204e-04 eta: 5:09:47 time: 0.4203 data_time: 0.0017 memory: 20334 grad_norm: 0.0131 loss: 0.6839 2023/06/01 02:57:25 - mmengine - INFO - Epoch(train) [13][2100/5758] lr: 3.5204e-04 eta: 5:09:04 time: 0.4262 data_time: 0.0021 memory: 20334 grad_norm: 0.0209 loss: 0.6896 2023/06/01 02:58:06 - mmengine - INFO - Epoch(train) [13][2200/5758] lr: 3.5204e-04 eta: 5:08:21 time: 0.3909 data_time: 0.0019 memory: 20334 grad_norm: 0.0167 loss: 0.6838 2023/06/01 02:58:46 - mmengine - INFO - Epoch(train) [13][2300/5758] lr: 3.5204e-04 eta: 5:07:38 time: 0.3971 data_time: 0.0020 memory: 20334 grad_norm: 0.0200 loss: 0.6866 2023/06/01 02:59:27 - mmengine - INFO - Epoch(train) [13][2400/5758] lr: 3.5204e-04 eta: 5:06:54 time: 0.4094 data_time: 0.0019 memory: 20334 
grad_norm: 0.0117 loss: 0.6851 2023/06/01 03:00:08 - mmengine - INFO - Epoch(train) [13][2500/5758] lr: 3.5204e-04 eta: 5:06:12 time: 0.3848 data_time: 0.0025 memory: 20334 grad_norm: 0.0142 loss: 0.6870 2023/06/01 03:00:48 - mmengine - INFO - Epoch(train) [13][2600/5758] lr: 3.5204e-04 eta: 5:05:28 time: 0.4538 data_time: 0.0018 memory: 20334 grad_norm: 0.0174 loss: 0.6851 2023/06/01 03:01:28 - mmengine - INFO - Epoch(train) [13][2700/5758] lr: 3.5204e-04 eta: 5:04:45 time: 0.3970 data_time: 0.0018 memory: 20334 grad_norm: 0.0168 loss: 0.6873 2023/06/01 03:02:08 - mmengine - INFO - Epoch(train) [13][2800/5758] lr: 3.5204e-04 eta: 5:04:01 time: 0.4020 data_time: 0.0014 memory: 20334 grad_norm: 0.0135 loss: 0.6868 2023/06/01 03:02:49 - mmengine - INFO - Epoch(train) [13][2900/5758] lr: 3.5204e-04 eta: 5:03:19 time: 0.3962 data_time: 0.0013 memory: 20334 grad_norm: 0.0152 loss: 0.6831 2023/06/01 03:02:51 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:03:30 - mmengine - INFO - Epoch(train) [13][3000/5758] lr: 3.5204e-04 eta: 5:02:36 time: 0.4417 data_time: 0.0021 memory: 20334 grad_norm: 0.0159 loss: 0.6846 2023/06/01 03:04:11 - mmengine - INFO - Epoch(train) [13][3100/5758] lr: 3.5204e-04 eta: 5:01:53 time: 0.4251 data_time: 0.0020 memory: 20334 grad_norm: 0.0134 loss: 0.6819 2023/06/01 03:04:52 - mmengine - INFO - Epoch(train) [13][3200/5758] lr: 3.5204e-04 eta: 5:01:10 time: 0.4020 data_time: 0.0019 memory: 20334 grad_norm: 0.0189 loss: 0.6857 2023/06/01 03:05:32 - mmengine - INFO - Epoch(train) [13][3300/5758] lr: 3.5204e-04 eta: 5:00:27 time: 0.4200 data_time: 0.0016 memory: 20334 grad_norm: 0.0112 loss: 0.6902 2023/06/01 03:06:12 - mmengine - INFO - Epoch(train) [13][3400/5758] lr: 3.5204e-04 eta: 4:59:43 time: 0.3886 data_time: 0.0016 memory: 20334 grad_norm: 0.0155 loss: 0.6882 2023/06/01 03:06:52 - mmengine - INFO - Epoch(train) [13][3500/5758] lr: 3.5204e-04 eta: 4:59:00 time: 0.4287 data_time: 0.0019 memory: 20334 
grad_norm: 0.0182 loss: 0.6857 2023/06/01 03:07:33 - mmengine - INFO - Epoch(train) [13][3600/5758] lr: 3.5204e-04 eta: 4:58:17 time: 0.4204 data_time: 0.0018 memory: 20334 grad_norm: 0.0161 loss: 0.6881 2023/06/01 03:08:14 - mmengine - INFO - Epoch(train) [13][3700/5758] lr: 3.5204e-04 eta: 4:57:34 time: 0.4773 data_time: 0.0016 memory: 20334 grad_norm: 0.0174 loss: 0.6847 2023/06/01 03:08:55 - mmengine - INFO - Epoch(train) [13][3800/5758] lr: 3.5204e-04 eta: 4:56:51 time: 0.4264 data_time: 0.0019 memory: 20334 grad_norm: 0.0183 loss: 0.6820 2023/06/01 03:09:36 - mmengine - INFO - Epoch(train) [13][3900/5758] lr: 3.5204e-04 eta: 4:56:08 time: 0.4156 data_time: 0.0018 memory: 20334 grad_norm: 0.0165 loss: 0.6912 2023/06/01 03:09:37 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:10:16 - mmengine - INFO - Epoch(train) [13][4000/5758] lr: 3.5204e-04 eta: 4:55:25 time: 0.4144 data_time: 0.0017 memory: 20334 grad_norm: 0.0192 loss: 0.6915 2023/06/01 03:10:58 - mmengine - INFO - Epoch(train) [13][4100/5758] lr: 3.5204e-04 eta: 4:54:43 time: 0.4298 data_time: 0.0016 memory: 20334 grad_norm: 0.0150 loss: 0.6840 2023/06/01 03:11:38 - mmengine - INFO - Epoch(train) [13][4200/5758] lr: 3.5204e-04 eta: 4:54:00 time: 0.4044 data_time: 0.0017 memory: 20334 grad_norm: 0.0147 loss: 0.6888 2023/06/01 03:12:19 - mmengine - INFO - Epoch(train) [13][4300/5758] lr: 3.5204e-04 eta: 4:53:16 time: 0.4214 data_time: 0.0016 memory: 20334 grad_norm: 0.0135 loss: 0.6862 2023/06/01 03:12:59 - mmengine - INFO - Epoch(train) [13][4400/5758] lr: 3.5204e-04 eta: 4:52:33 time: 0.4440 data_time: 0.0018 memory: 20334 grad_norm: 0.0154 loss: 0.6865 2023/06/01 03:13:40 - mmengine - INFO - Epoch(train) [13][4500/5758] lr: 3.5204e-04 eta: 4:51:50 time: 0.4069 data_time: 0.0016 memory: 20334 grad_norm: 0.0135 loss: 0.6880 2023/06/01 03:14:20 - mmengine - INFO - Epoch(train) [13][4600/5758] lr: 3.5204e-04 eta: 4:51:07 time: 0.3917 data_time: 0.0018 memory: 20334 
grad_norm: 0.0148 loss: 0.6874 2023/06/01 03:14:59 - mmengine - INFO - Epoch(train) [13][4700/5758] lr: 3.5204e-04 eta: 4:50:23 time: 0.3876 data_time: 0.0019 memory: 20334 grad_norm: 0.0158 loss: 0.6909 2023/06/01 03:15:40 - mmengine - INFO - Epoch(train) [13][4800/5758] lr: 3.5204e-04 eta: 4:49:40 time: 0.3883 data_time: 0.0017 memory: 20334 grad_norm: 0.0157 loss: 0.6887 2023/06/01 03:16:21 - mmengine - INFO - Epoch(train) [13][4900/5758] lr: 3.5204e-04 eta: 4:48:58 time: 0.3957 data_time: 0.0019 memory: 20334 grad_norm: 0.0104 loss: 0.6860 2023/06/01 03:16:23 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:17:03 - mmengine - INFO - Epoch(train) [13][5000/5758] lr: 3.5204e-04 eta: 4:48:16 time: 0.4279 data_time: 0.0016 memory: 20334 grad_norm: 0.0171 loss: 0.6907 2023/06/01 03:17:44 - mmengine - INFO - Epoch(train) [13][5100/5758] lr: 3.5204e-04 eta: 4:47:33 time: 0.4068 data_time: 0.0018 memory: 20334 grad_norm: 0.0140 loss: 0.6830 2023/06/01 03:18:25 - mmengine - INFO - Epoch(train) [13][5200/5758] lr: 3.5204e-04 eta: 4:46:50 time: 0.3933 data_time: 0.0017 memory: 20334 grad_norm: 0.0163 loss: 0.6807 2023/06/01 03:19:07 - mmengine - INFO - Epoch(train) [13][5300/5758] lr: 3.5204e-04 eta: 4:46:08 time: 0.3786 data_time: 0.0017 memory: 20334 grad_norm: 0.0129 loss: 0.6895 2023/06/01 03:19:48 - mmengine - INFO - Epoch(train) [13][5400/5758] lr: 3.5204e-04 eta: 4:45:25 time: 0.3915 data_time: 0.0019 memory: 20334 grad_norm: 0.0120 loss: 0.6844 2023/06/01 03:20:28 - mmengine - INFO - Epoch(train) [13][5500/5758] lr: 3.5204e-04 eta: 4:44:42 time: 0.4118 data_time: 0.0019 memory: 20334 grad_norm: 0.0198 loss: 0.6846 2023/06/01 03:21:10 - mmengine - INFO - Epoch(train) [13][5600/5758] lr: 3.5204e-04 eta: 4:44:00 time: 0.4156 data_time: 0.0020 memory: 20334 grad_norm: 0.0192 loss: 0.6796 2023/06/01 03:21:50 - mmengine - INFO - Epoch(train) [13][5700/5758] lr: 3.5204e-04 eta: 4:43:17 time: 0.4015 data_time: 0.0025 memory: 20334 
grad_norm: 0.0195 loss: 0.6879 2023/06/01 03:22:13 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:22:13 - mmengine - INFO - Saving checkpoint at 13 epochs 2023/06/01 03:22:30 - mmengine - INFO - Epoch(val) [13][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3619 time: 0.9856 2023/06/01 03:23:15 - mmengine - INFO - Epoch(train) [14][ 100/5758] lr: 2.8027e-04 eta: 4:42:11 time: 0.4051 data_time: 0.0016 memory: 20334 grad_norm: 0.0117 loss: 0.6840 2023/06/01 03:23:33 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:23:56 - mmengine - INFO - Epoch(train) [14][ 200/5758] lr: 2.8027e-04 eta: 4:41:28 time: 0.4049 data_time: 0.0018 memory: 20334 grad_norm: 0.0189 loss: 0.6888 2023/06/01 03:24:35 - mmengine - INFO - Epoch(train) [14][ 300/5758] lr: 2.8027e-04 eta: 4:40:45 time: 0.4188 data_time: 0.0016 memory: 20334 grad_norm: 0.0146 loss: 0.6866 2023/06/01 03:25:16 - mmengine - INFO - Epoch(train) [14][ 400/5758] lr: 2.8027e-04 eta: 4:40:02 time: 0.4036 data_time: 0.0017 memory: 20334 grad_norm: 0.0139 loss: 0.6841 2023/06/01 03:25:57 - mmengine - INFO - Epoch(train) [14][ 500/5758] lr: 2.8027e-04 eta: 4:39:19 time: 0.4008 data_time: 0.0016 memory: 20334 grad_norm: 0.0176 loss: 0.6901 2023/06/01 03:26:38 - mmengine - INFO - Epoch(train) [14][ 600/5758] lr: 2.8027e-04 eta: 4:38:37 time: 0.3748 data_time: 0.0018 memory: 20334 grad_norm: 0.0164 loss: 0.6843 2023/06/01 03:27:18 - mmengine - INFO - Epoch(train) [14][ 700/5758] lr: 2.8027e-04 eta: 4:37:53 time: 0.3973 data_time: 0.0017 memory: 20334 grad_norm: 0.0147 loss: 0.6854 2023/06/01 03:27:59 - mmengine - INFO - Epoch(train) [14][ 800/5758] lr: 2.8027e-04 eta: 4:37:11 time: 0.3918 data_time: 0.0020 memory: 20334 grad_norm: 0.0120 loss: 0.6881 2023/06/01 03:28:39 - mmengine - INFO - Epoch(train) [14][ 900/5758] lr: 
2.8027e-04 eta: 4:36:27 time: 0.4074 data_time: 0.0015 memory: 20334 grad_norm: 0.0124 loss: 0.6873 2023/06/01 03:29:20 - mmengine - INFO - Epoch(train) [14][1000/5758] lr: 2.8027e-04 eta: 4:35:45 time: 0.4161 data_time: 0.0017 memory: 20334 grad_norm: 0.0130 loss: 0.6828 2023/06/01 03:30:00 - mmengine - INFO - Epoch(train) [14][1100/5758] lr: 2.8027e-04 eta: 4:35:02 time: 0.4034 data_time: 0.0017 memory: 20334 grad_norm: 0.0188 loss: 0.6833 2023/06/01 03:30:20 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 03:30:42 - mmengine - INFO - Epoch(train) [14][1200/5758] lr: 2.8027e-04 eta: 4:34:19 time: 0.4017 data_time: 0.0015 memory: 20334 grad_norm: 0.0133 loss: 0.6858 2023/06/01 03:31:23 - mmengine - INFO - Epoch(train) [14][1300/5758] lr: 2.8027e-04 eta: 4:33:37 time: 0.3840 data_time: 0.0017 memory: 20334 grad_norm: 0.0218 loss: 0.6842 2023/06/01 03:32:03 - mmengine - INFO - Epoch(train) [14][1400/5758] lr: 2.8027e-04 eta: 4:32:53 time: 0.3650 data_time: 0.0025 memory: 20334 grad_norm: 0.0217 loss: 0.6864 2023/06/01 03:32:43 - mmengine - INFO - Epoch(train) [14][1500/5758] lr: 2.8027e-04 eta: 4:32:10 time: 0.3948 data_time: 0.0022 memory: 20334 grad_norm: 0.0172 loss: 0.6841 2023/06/01 03:33:23 - mmengine - INFO - Epoch(train) [14][1600/5758] lr: 2.8027e-04 eta: 4:31:27 time: 0.4120 data_time: 0.0018 memory: 20334 grad_norm: 0.0167 loss: 0.6881 2023/06/01 03:34:04 - mmengine - INFO - Epoch(train) [14][1700/5758] lr: 2.8027e-04 eta: 4:30:45 time: 0.4402 data_time: 0.0019 memory: 20334 grad_norm: 0.0161 loss: 0.6885 2023/06/01 03:34:44 - mmengine - INFO - Epoch(train) [14][1800/5758] lr: 2.8027e-04 eta: 4:30:01 time: 0.3780 data_time: 0.0028 memory: 20334 grad_norm: 0.0162 loss: 0.6849 2023/06/01 03:35:25 - mmengine - INFO - Epoch(train) [14][1900/5758] lr: 2.8027e-04 eta: 4:29:19 time: 0.4140 data_time: 0.0017 memory: 20334 grad_norm: 0.0153 loss: 0.6846 2023/06/01 03:36:05 - mmengine - INFO - Epoch(train) [14][2000/5758] lr: 
2.8027e-04 eta: 4:28:36 time: 0.3821 data_time: 0.0019 memory: 20334 grad_norm: 0.0220 loss: 0.6852
2023/06/01 03:36:46 - mmengine - INFO - Epoch(train) [14][2100/5758] lr: 2.8027e-04 eta: 4:27:53 time: 0.4418 data_time: 0.0032 memory: 20334 grad_norm: 0.0167 loss: 0.6878
2023/06/01 03:37:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 03:37:26 - mmengine - INFO - Epoch(train) [14][2200/5758] lr: 2.8027e-04 eta: 4:27:10 time: 0.3728 data_time: 0.0019 memory: 20334 grad_norm: 0.0161 loss: 0.6832
2023/06/01 03:38:07 - mmengine - INFO - Epoch(train) [14][2300/5758] lr: 2.8027e-04 eta: 4:26:27 time: 0.3868 data_time: 0.0021 memory: 20334 grad_norm: 0.0172 loss: 0.6870
2023/06/01 03:38:47 - mmengine - INFO - Epoch(train) [14][2400/5758] lr: 2.8027e-04 eta: 4:25:45 time: 0.3791 data_time: 0.0019 memory: 20334 grad_norm: 0.0179 loss: 0.6842
2023/06/01 03:39:28 - mmengine - INFO - Epoch(train) [14][2500/5758] lr: 2.8027e-04 eta: 4:25:02 time: 0.4111 data_time: 0.0023 memory: 20334 grad_norm: 0.0167 loss: 0.6866
2023/06/01 03:40:09 - mmengine - INFO - Epoch(train) [14][2600/5758] lr: 2.8027e-04 eta: 4:24:19 time: 0.4093 data_time: 0.0028 memory: 20334 grad_norm: 0.0161 loss: 0.6886
2023/06/01 03:40:49 - mmengine - INFO - Epoch(train) [14][2700/5758] lr: 2.8027e-04 eta: 4:23:36 time: 0.3964 data_time: 0.0019 memory: 20334 grad_norm: 0.0148 loss: 0.6856
2023/06/01 03:41:30 - mmengine - INFO - Epoch(train) [14][2800/5758] lr: 2.8027e-04 eta: 4:22:54 time: 0.4002 data_time: 0.0022 memory: 20334 grad_norm: 0.0129 loss: 0.6868
2023/06/01 03:42:10 - mmengine - INFO - Epoch(train) [14][2900/5758] lr: 2.8027e-04 eta: 4:22:11 time: 0.4013 data_time: 0.0016 memory: 20334 grad_norm: 0.0194 loss: 0.6817
2023/06/01 03:42:51 - mmengine - INFO - Epoch(train) [14][3000/5758] lr: 2.8027e-04 eta: 4:21:28 time: 0.3891 data_time: 0.0030 memory: 20334 grad_norm: 0.0112 loss: 0.6808
2023/06/01 03:43:30 - mmengine - INFO - Epoch(train) [14][3100/5758] lr: 2.8027e-04 eta: 4:20:45 time: 0.3894 data_time: 0.0022 memory: 20334 grad_norm: 0.0140 loss: 0.6876
2023/06/01 03:43:50 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 03:44:12 - mmengine - INFO - Epoch(train) [14][3200/5758] lr: 2.8027e-04 eta: 4:20:02 time: 0.4051 data_time: 0.0018 memory: 20334 grad_norm: 0.0148 loss: 0.6881
2023/06/01 03:44:53 - mmengine - INFO - Epoch(train) [14][3300/5758] lr: 2.8027e-04 eta: 4:19:20 time: 0.4410 data_time: 0.0026 memory: 20334 grad_norm: 0.0166 loss: 0.6841
2023/06/01 03:45:33 - mmengine - INFO - Epoch(train) [14][3400/5758] lr: 2.8027e-04 eta: 4:18:37 time: 0.3974 data_time: 0.0017 memory: 20334 grad_norm: 0.0212 loss: 0.6881
2023/06/01 03:46:13 - mmengine - INFO - Epoch(train) [14][3500/5758] lr: 2.8027e-04 eta: 4:17:54 time: 0.4076 data_time: 0.0025 memory: 20334 grad_norm: 0.0132 loss: 0.6879
2023/06/01 03:46:53 - mmengine - INFO - Epoch(train) [14][3600/5758] lr: 2.8027e-04 eta: 4:17:11 time: 0.4075 data_time: 0.0023 memory: 20334 grad_norm: 0.0202 loss: 0.6878
2023/06/01 03:47:34 - mmengine - INFO - Epoch(train) [14][3700/5758] lr: 2.8027e-04 eta: 4:16:28 time: 0.4230 data_time: 0.0018 memory: 20334 grad_norm: 0.0168 loss: 0.6904
2023/06/01 03:48:14 - mmengine - INFO - Epoch(train) [14][3800/5758] lr: 2.8027e-04 eta: 4:15:45 time: 0.4008 data_time: 0.0020 memory: 20334 grad_norm: 0.0209 loss: 0.6921
2023/06/01 03:48:56 - mmengine - INFO - Epoch(train) [14][3900/5758] lr: 2.8027e-04 eta: 4:15:03 time: 0.4062 data_time: 0.0026 memory: 20334 grad_norm: 0.0159 loss: 0.6849
2023/06/01 03:49:37 - mmengine - INFO - Epoch(train) [14][4000/5758] lr: 2.8027e-04 eta: 4:14:20 time: 0.4196 data_time: 0.0024 memory: 20334 grad_norm: 0.0141 loss: 0.6823
2023/06/01 03:50:19 - mmengine - INFO - Epoch(train) [14][4100/5758] lr: 2.8027e-04 eta: 4:13:38 time: 0.4289 data_time: 0.0028 memory: 20334 grad_norm: 0.0155 loss: 0.6923
2023/06/01 03:50:37 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 03:51:00 - mmengine - INFO - Epoch(train) [14][4200/5758] lr: 2.8027e-04 eta: 4:12:56 time: 0.4182 data_time: 0.0023 memory: 20334 grad_norm: 0.0169 loss: 0.6882
2023/06/01 03:51:41 - mmengine - INFO - Epoch(train) [14][4300/5758] lr: 2.8027e-04 eta: 4:12:14 time: 0.4597 data_time: 0.0022 memory: 20334 grad_norm: 0.0163 loss: 0.6863
2023/06/01 03:52:21 - mmengine - INFO - Epoch(train) [14][4400/5758] lr: 2.8027e-04 eta: 4:11:30 time: 0.4341 data_time: 0.0023 memory: 20334 grad_norm: 0.0165 loss: 0.6871
2023/06/01 03:53:02 - mmengine - INFO - Epoch(train) [14][4500/5758] lr: 2.8027e-04 eta: 4:10:48 time: 0.3826 data_time: 0.0024 memory: 20334 grad_norm: 0.0174 loss: 0.6883
2023/06/01 03:53:42 - mmengine - INFO - Epoch(train) [14][4600/5758] lr: 2.8027e-04 eta: 4:10:05 time: 0.4040 data_time: 0.0028 memory: 20334 grad_norm: 0.0177 loss: 0.6873
2023/06/01 03:54:23 - mmengine - INFO - Epoch(train) [14][4700/5758] lr: 2.8027e-04 eta: 4:09:23 time: 0.4057 data_time: 0.0029 memory: 20334 grad_norm: 0.0138 loss: 0.6849
2023/06/01 03:55:02 - mmengine - INFO - Epoch(train) [14][4800/5758] lr: 2.8027e-04 eta: 4:08:39 time: 0.3935 data_time: 0.0018 memory: 20334 grad_norm: 0.0143 loss: 0.6844
2023/06/01 03:55:43 - mmengine - INFO - Epoch(train) [14][4900/5758] lr: 2.8027e-04 eta: 4:07:57 time: 0.3880 data_time: 0.0022 memory: 20334 grad_norm: 0.0140 loss: 0.6846
2023/06/01 03:56:24 - mmengine - INFO - Epoch(train) [14][5000/5758] lr: 2.8027e-04 eta: 4:07:14 time: 0.4062 data_time: 0.0018 memory: 20334 grad_norm: 0.0130 loss: 0.6860
2023/06/01 03:57:04 - mmengine - INFO - Epoch(train) [14][5100/5758] lr: 2.8027e-04 eta: 4:06:31 time: 0.4120 data_time: 0.0016 memory: 20334 grad_norm: 0.0183 loss: 0.6916
2023/06/01 03:57:23 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 03:57:44 - mmengine - INFO - Epoch(train) [14][5200/5758] lr: 2.8027e-04 eta: 4:05:49 time: 0.3808 data_time: 0.0028 memory: 20334 grad_norm: 0.0202 loss: 0.6902
2023/06/01 03:58:25 - mmengine - INFO - Epoch(train) [14][5300/5758] lr: 2.8027e-04 eta: 4:05:06 time: 0.4348 data_time: 0.0020 memory: 20334 grad_norm: 0.0183 loss: 0.6889
2023/06/01 03:59:05 - mmengine - INFO - Epoch(train) [14][5400/5758] lr: 2.8027e-04 eta: 4:04:23 time: 0.4004 data_time: 0.0017 memory: 20334 grad_norm: 0.0187 loss: 0.6876
2023/06/01 03:59:47 - mmengine - INFO - Epoch(train) [14][5500/5758] lr: 2.8027e-04 eta: 4:03:41 time: 0.4484 data_time: 0.0020 memory: 20334 grad_norm: 0.0207 loss: 0.6857
2023/06/01 04:00:27 - mmengine - INFO - Epoch(train) [14][5600/5758] lr: 2.8027e-04 eta: 4:02:58 time: 0.3978 data_time: 0.0018 memory: 20334 grad_norm: 0.0182 loss: 0.6878
2023/06/01 04:01:06 - mmengine - INFO - Epoch(train) [14][5700/5758] lr: 2.8027e-04 eta: 4:02:15 time: 0.4089 data_time: 0.0019 memory: 20334 grad_norm: 0.0149 loss: 0.6852
2023/06/01 04:01:30 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:01:30 - mmengine - INFO - Saving checkpoint at 14 epochs
2023/06/01 04:01:47 - mmengine - INFO - Epoch(val) [14][8/8] accuracy/top1: 100.0000 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [100.0, 0.0] single-label/f1-score_classwise: [100.0, 0.0] data_time: 0.3668 time: 0.9949
2023/06/01 04:02:32 - mmengine - INFO - Epoch(train) [15][ 100/5758] lr: 2.1405e-04 eta: 4:01:10 time: 0.3993 data_time: 0.0020 memory: 20334 grad_norm: 0.0204 loss: 0.6880
2023/06/01 04:03:13 - mmengine - INFO - Epoch(train) [15][ 200/5758] lr: 2.1405e-04 eta: 4:00:27 time: 0.4345 data_time: 0.0018 memory: 20334 grad_norm: 0.0132 loss: 0.6877
2023/06/01 04:03:54 - mmengine - INFO - Epoch(train) [15][ 300/5758] lr: 2.1405e-04 eta: 3:59:44 time: 0.4268 data_time: 0.0021 memory: 20334 grad_norm: 0.0208 loss: 0.6855
2023/06/01 04:04:30 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:04:34 - mmengine - INFO - Epoch(train) [15][ 400/5758] lr: 2.1405e-04 eta: 3:59:02 time: 0.4004 data_time: 0.0031 memory: 20334 grad_norm: 0.0173 loss: 0.6879
2023/06/01 04:05:16 - mmengine - INFO - Epoch(train) [15][ 500/5758] lr: 2.1405e-04 eta: 3:58:20 time: 0.3764 data_time: 0.0019 memory: 20334 grad_norm: 0.0120 loss: 0.6882
2023/06/01 04:05:56 - mmengine - INFO - Epoch(train) [15][ 600/5758] lr: 2.1405e-04 eta: 3:57:37 time: 0.3963 data_time: 0.0019 memory: 20334 grad_norm: 0.0103 loss: 0.6887
2023/06/01 04:06:37 - mmengine - INFO - Epoch(train) [15][ 700/5758] lr: 2.1405e-04 eta: 3:56:54 time: 0.4071 data_time: 0.0019 memory: 20334 grad_norm: 0.0194 loss: 0.6878
2023/06/01 04:07:18 - mmengine - INFO - Epoch(train) [15][ 800/5758] lr: 2.1405e-04 eta: 3:56:12 time: 0.4206 data_time: 0.0020 memory: 20334 grad_norm: 0.0173 loss: 0.6901
2023/06/01 04:08:01 - mmengine - INFO - Epoch(train) [15][ 900/5758] lr: 2.1405e-04 eta: 3:55:30 time: 0.3930 data_time: 0.0017 memory: 20334 grad_norm: 0.0251 loss: 0.6883
2023/06/01 04:08:41 - mmengine - INFO - Epoch(train) [15][1000/5758] lr: 2.1405e-04 eta: 3:54:48 time: 0.3753 data_time: 0.0016 memory: 20334 grad_norm: 0.0246 loss: 0.6852
2023/06/01 04:09:22 - mmengine - INFO - Epoch(train) [15][1100/5758] lr: 2.1405e-04 eta: 3:54:05 time: 0.4012 data_time: 0.0021 memory: 20334 grad_norm: 0.0125 loss: 0.6830
2023/06/01 04:10:03 - mmengine - INFO - Epoch(train) [15][1200/5758] lr: 2.1405e-04 eta: 3:53:23 time: 0.3802 data_time: 0.0017 memory: 20334 grad_norm: 0.0187 loss: 0.6915
2023/06/01 04:10:44 - mmengine - INFO - Epoch(train) [15][1300/5758] lr: 2.1405e-04 eta: 3:52:41 time: 0.4221 data_time: 0.0018 memory: 20334 grad_norm: 0.0111 loss: 0.6896
2023/06/01 04:11:20 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:11:25 - mmengine - INFO - Epoch(train) [15][1400/5758] lr: 2.1405e-04 eta: 3:51:58 time: 0.4112 data_time: 0.0020 memory: 20334 grad_norm: 0.0220 loss: 0.6879
2023/06/01 04:12:05 - mmengine - INFO - Epoch(train) [15][1500/5758] lr: 2.1405e-04 eta: 3:51:15 time: 0.4390 data_time: 0.0026 memory: 20334 grad_norm: 0.0219 loss: 0.6829
2023/06/01 04:12:45 - mmengine - INFO - Epoch(train) [15][1600/5758] lr: 2.1405e-04 eta: 3:50:33 time: 0.3933 data_time: 0.0015 memory: 20334 grad_norm: 0.7235 loss: 0.6865
2023/06/01 04:13:26 - mmengine - INFO - Epoch(train) [15][1700/5758] lr: 2.1405e-04 eta: 3:49:50 time: 0.3944 data_time: 0.0015 memory: 20334 grad_norm: 0.0194 loss: 0.6884
2023/06/01 04:14:08 - mmengine - INFO - Epoch(train) [15][1800/5758] lr: 2.1405e-04 eta: 3:49:08 time: 0.4042 data_time: 0.0019 memory: 20334 grad_norm: 0.0194 loss: 0.6874
2023/06/01 04:14:47 - mmengine - INFO - Epoch(train) [15][1900/5758] lr: 2.1405e-04 eta: 3:48:25 time: 0.4023 data_time: 0.0018 memory: 20334 grad_norm: 0.0235 loss: 0.6824
2023/06/01 04:15:28 - mmengine - INFO - Epoch(train) [15][2000/5758] lr: 2.1405e-04 eta: 3:47:43 time: 0.4261 data_time: 0.0028 memory: 20334 grad_norm: 0.0168 loss: 0.6863
2023/06/01 04:16:09 - mmengine - INFO - Epoch(train) [15][2100/5758] lr: 2.1405e-04 eta: 3:47:00 time: 0.4147 data_time: 0.0019 memory: 20334 grad_norm: 0.0195 loss: 0.6848
2023/06/01 04:16:49 - mmengine - INFO - Epoch(train) [15][2200/5758] lr: 2.1405e-04 eta: 3:46:17 time: 0.4664 data_time: 0.0023 memory: 20334 grad_norm: 0.0115 loss: 0.6897
2023/06/01 04:17:29 - mmengine - INFO - Epoch(train) [15][2300/5758] lr: 2.1405e-04 eta: 3:45:35 time: 0.3852 data_time: 0.0019 memory: 20334 grad_norm: 0.0186 loss: 0.6862
2023/06/01 04:18:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:18:10 - mmengine - INFO - Epoch(train) [15][2400/5758] lr: 2.1405e-04 eta: 3:44:53 time: 0.4318 data_time: 0.0021 memory: 20334 grad_norm: 0.0214 loss: 0.6847
2023/06/01 04:18:50 - mmengine - INFO - Epoch(train) [15][2500/5758] lr: 2.1405e-04 eta: 3:44:10 time: 0.3894 data_time: 0.0020 memory: 20334 grad_norm: 0.0150 loss: 0.6843
2023/06/01 04:19:31 - mmengine - INFO - Epoch(train) [15][2600/5758] lr: 2.1405e-04 eta: 3:43:27 time: 0.4034 data_time: 0.0023 memory: 20334 grad_norm: 0.0115 loss: 0.6823
2023/06/01 04:20:11 - mmengine - INFO - Epoch(train) [15][2700/5758] lr: 2.1405e-04 eta: 3:42:45 time: 0.3971 data_time: 0.0018 memory: 20334 grad_norm: 0.0240 loss: 0.6818
2023/06/01 04:20:52 - mmengine - INFO - Epoch(train) [15][2800/5758] lr: 2.1405e-04 eta: 3:42:02 time: 0.3992 data_time: 0.0021 memory: 20334 grad_norm: 0.0106 loss: 0.6865
2023/06/01 04:21:32 - mmengine - INFO - Epoch(train) [15][2900/5758] lr: 2.1405e-04 eta: 3:41:20 time: 0.3959 data_time: 0.0019 memory: 20334 grad_norm: 0.0105 loss: 0.6884
2023/06/01 04:22:14 - mmengine - INFO - Epoch(train) [15][3000/5758] lr: 2.1405e-04 eta: 3:40:38 time: 0.4292 data_time: 0.0017 memory: 20334 grad_norm: 0.0187 loss: 0.6831
2023/06/01 04:22:55 - mmengine - INFO - Epoch(train) [15][3100/5758] lr: 2.1405e-04 eta: 3:39:55 time: 0.4178 data_time: 0.0017 memory: 20334 grad_norm: 0.0196 loss: 0.6889
2023/06/01 04:23:35 - mmengine - INFO - Epoch(train) [15][3200/5758] lr: 2.1405e-04 eta: 3:39:13 time: 0.3990 data_time: 0.0018 memory: 20334 grad_norm: 0.0221 loss: 0.6835
2023/06/01 04:24:17 - mmengine - INFO - Epoch(train) [15][3300/5758] lr: 2.1405e-04 eta: 3:38:30 time: 0.4686 data_time: 0.0017 memory: 20334 grad_norm: 8.7150 loss: 0.6825
2023/06/01 04:24:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:24:58 - mmengine - INFO - Epoch(train) [15][3400/5758] lr: 2.1405e-04 eta: 3:37:48 time: 0.4047 data_time: 0.0016 memory: 20334 grad_norm: 0.0140 loss: 0.6852
2023/06/01 04:25:39 - mmengine - INFO - Epoch(train) [15][3500/5758] lr: 2.1405e-04 eta: 3:37:06 time: 0.4381 data_time: 0.0018 memory: 20334 grad_norm: 0.0253 loss: 0.6840
2023/06/01 04:26:19 - mmengine - INFO - Epoch(train) [15][3600/5758] lr: 2.1405e-04 eta: 3:36:23 time: 0.3851 data_time: 0.0017 memory: 20334 grad_norm: 0.0284 loss: 0.6848
2023/06/01 04:27:00 - mmengine - INFO - Epoch(train) [15][3700/5758] lr: 2.1405e-04 eta: 3:35:41 time: 0.4300 data_time: 0.0018 memory: 20334 grad_norm: 0.0278 loss: 0.6871
2023/06/01 04:27:41 - mmengine - INFO - Epoch(train) [15][3800/5758] lr: 2.1405e-04 eta: 3:34:59 time: 0.4136 data_time: 0.0021 memory: 20334 grad_norm: 0.0438 loss: 0.6851
2023/06/01 04:28:21 - mmengine - INFO - Epoch(train) [15][3900/5758] lr: 2.1405e-04 eta: 3:34:16 time: 0.3976 data_time: 0.0018 memory: 20334 grad_norm: 0.0293 loss: 0.6853
2023/06/01 04:29:03 - mmengine - INFO - Epoch(train) [15][4000/5758] lr: 2.1405e-04 eta: 3:33:34 time: 0.4272 data_time: 0.0030 memory: 20334 grad_norm: 0.0542 loss: 0.6835
2023/06/01 04:29:47 - mmengine - INFO - Epoch(train) [15][4100/5758] lr: 2.1405e-04 eta: 3:32:53 time: 0.4178 data_time: 0.0025 memory: 20334 grad_norm: 0.1979 loss: 0.6902
2023/06/01 04:30:27 - mmengine - INFO - Epoch(train) [15][4200/5758] lr: 2.1405e-04 eta: 3:32:10 time: 0.4293 data_time: 0.0020 memory: 20334 grad_norm: 0.0218 loss: 0.6847
2023/06/01 04:31:07 - mmengine - INFO - Epoch(train) [15][4300/5758] lr: 2.1405e-04 eta: 3:31:28 time: 0.3864 data_time: 0.0019 memory: 20334 grad_norm: 0.1379 loss: 0.6823
2023/06/01 04:31:42 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:31:47 - mmengine - INFO - Epoch(train) [15][4400/5758] lr: 2.1405e-04 eta: 3:30:45 time: 0.3929 data_time: 0.0020 memory: 20334 grad_norm: 0.0296 loss: 0.6803
2023/06/01 04:32:28 - mmengine - INFO - Epoch(train) [15][4500/5758] lr: 2.1405e-04 eta: 3:30:02 time: 0.4120 data_time: 0.0018 memory: 20334 grad_norm: 0.1280 loss: 0.6833
2023/06/01 04:33:08 - mmengine - INFO - Epoch(train) [15][4600/5758] lr: 2.1405e-04 eta: 3:29:20 time: 0.4210 data_time: 0.0020 memory: 20334 grad_norm: 0.0831 loss: 0.6848
2023/06/01 04:33:48 - mmengine - INFO - Epoch(train) [15][4700/5758] lr: 2.1405e-04 eta: 3:28:37 time: 0.4033 data_time: 0.0015 memory: 20334 grad_norm: 0.0285 loss: 0.6901
2023/06/01 04:34:30 - mmengine - INFO - Epoch(train) [15][4800/5758] lr: 2.1405e-04 eta: 3:27:55 time: 0.4446 data_time: 0.0015 memory: 20334 grad_norm: 0.0201 loss: 0.6823
2023/06/01 04:35:10 - mmengine - INFO - Epoch(train) [15][4900/5758] lr: 2.1405e-04 eta: 3:27:13 time: 0.4031 data_time: 0.0019 memory: 20334 grad_norm: 0.0154 loss: 0.6880
2023/06/01 04:35:50 - mmengine - INFO - Epoch(train) [15][5000/5758] lr: 2.1405e-04 eta: 3:26:30 time: 0.4032 data_time: 0.0016 memory: 20334 grad_norm: 8.6955 loss: 0.6833
2023/06/01 04:36:30 - mmengine - INFO - Epoch(train) [15][5100/5758] lr: 2.1405e-04 eta: 3:25:48 time: 0.3882 data_time: 0.0020 memory: 20334 grad_norm: 0.0649 loss: 0.6860
2023/06/01 04:37:11 - mmengine - INFO - Epoch(train) [15][5200/5758] lr: 2.1405e-04 eta: 3:25:05 time: 0.3962 data_time: 0.0024 memory: 20334 grad_norm: 0.0877 loss: 0.6858
2023/06/01 04:37:52 - mmengine - INFO - Epoch(train) [15][5300/5758] lr: 2.1405e-04 eta: 3:24:23 time: 0.4034 data_time: 0.0021 memory: 20334 grad_norm: 3.8150 loss: 0.6732
2023/06/01 04:38:28 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:38:32 - mmengine - INFO - Epoch(train) [15][5400/5758] lr: 2.1405e-04 eta: 3:23:41 time: 0.3718 data_time: 0.0022 memory: 20334 grad_norm: 2.4243 loss: 0.6876
2023/06/01 04:39:15 - mmengine - INFO - Epoch(train) [15][5500/5758] lr: 2.1405e-04 eta: 3:22:59 time: 0.4638 data_time: 0.0023 memory: 20334 grad_norm: 1.6639 loss: 0.6721
2023/06/01 04:39:57 - mmengine - INFO - Epoch(train) [15][5600/5758] lr: 2.1405e-04 eta: 3:22:17 time: 0.4014 data_time: 0.0020 memory: 20334 grad_norm: 46.6717 loss: 0.6611
2023/06/01 04:40:36 - mmengine - INFO - Epoch(train) [15][5700/5758] lr: 2.1405e-04 eta: 3:21:34 time: 0.4040 data_time: 0.0022 memory: 20334 grad_norm: 2.5040 loss: 0.6639
2023/06/01 04:40:59 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:40:59 - mmengine - INFO - Saving checkpoint at 15 epochs
2023/06/01 04:41:16 - mmengine - INFO - Epoch(val) [15][8/8] accuracy/top1: 93.8956 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [93.89559173583984, 0.0] single-label/f1-score_classwise: [96.8517074584961, 0.0] data_time: 0.3732 time: 0.9995
2023/06/01 04:42:01 - mmengine - INFO - Epoch(train) [16][ 100/5758] lr: 1.5498e-04 eta: 3:20:29 time: 0.4125 data_time: 0.0017 memory: 20334 grad_norm: 12.9511 loss: 0.6620
2023/06/01 04:42:42 - mmengine - INFO - Epoch(train) [16][ 200/5758] lr: 1.5498e-04 eta: 3:19:46 time: 0.4402 data_time: 0.0018 memory: 20334 grad_norm: 20.3987 loss: 0.6426
2023/06/01 04:43:22 - mmengine - INFO - Epoch(train) [16][ 300/5758] lr: 1.5498e-04 eta: 3:19:04 time: 0.3781 data_time: 0.0025 memory: 20334 grad_norm: 166.1285 loss: 0.6846
2023/06/01 04:44:03 - mmengine - INFO - Epoch(train) [16][ 400/5758] lr: 1.5498e-04 eta: 3:18:22 time: 0.4268 data_time: 0.0031 memory: 20334 grad_norm: 17.4556 loss: 0.6753
2023/06/01 04:44:43 - mmengine - INFO - Epoch(train) [16][ 500/5758] lr: 1.5498e-04 eta: 3:17:39 time: 0.4338 data_time: 0.0017 memory: 20334 grad_norm: 153.9520 loss: 0.6711
2023/06/01 04:45:23 - mmengine - INFO - Epoch(train) [16][ 600/5758] lr: 1.5498e-04 eta: 3:16:56 time: 0.4157 data_time: 0.0019 memory: 20334 grad_norm: 16.2066 loss: 0.6689
2023/06/01 04:45:35 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:46:05 - mmengine - INFO - Epoch(train) [16][ 700/5758] lr: 1.5498e-04 eta: 3:16:14 time: 0.4042 data_time: 0.0025 memory: 20334 grad_norm: 5.0856 loss: 0.6964
2023/06/01 04:46:45 - mmengine - INFO - Epoch(train) [16][ 800/5758] lr: 1.5498e-04 eta: 3:15:32 time: 0.4223 data_time: 0.0034 memory: 20334 grad_norm: 6.3401 loss: 0.6841
2023/06/01 04:47:27 - mmengine - INFO - Epoch(train) [16][ 900/5758] lr: 1.5498e-04 eta: 3:14:50 time: 0.3991 data_time: 0.0016 memory: 20334 grad_norm: 0.9693 loss: 0.6880
2023/06/01 04:48:07 - mmengine - INFO - Epoch(train) [16][1000/5758] lr: 1.5498e-04 eta: 3:14:08 time: 0.4078 data_time: 0.0020 memory: 20334 grad_norm: 1.2147 loss: 0.6861
2023/06/01 04:48:49 - mmengine - INFO - Epoch(train) [16][1100/5758] lr: 1.5498e-04 eta: 3:13:26 time: 0.4397 data_time: 0.0017 memory: 20334 grad_norm: 0.8336 loss: 0.6850
2023/06/01 04:49:30 - mmengine - INFO - Epoch(train) [16][1200/5758] lr: 1.5498e-04 eta: 3:12:44 time: 0.4458 data_time: 0.0019 memory: 20334 grad_norm: 1.7692 loss: 0.6863
2023/06/01 04:50:11 - mmengine - INFO - Epoch(train) [16][1300/5758] lr: 1.5498e-04 eta: 3:12:01 time: 0.4181 data_time: 0.0022 memory: 20334 grad_norm: 1.9187 loss: 0.6828
2023/06/01 04:50:51 - mmengine - INFO - Epoch(train) [16][1400/5758] lr: 1.5498e-04 eta: 3:11:19 time: 0.3825 data_time: 0.0031 memory: 20334 grad_norm: 10.5395 loss: 0.6742
2023/06/01 04:51:32 - mmengine - INFO - Epoch(train) [16][1500/5758] lr: 1.5498e-04 eta: 3:10:36 time: 0.4437 data_time: 0.0026 memory: 20334 grad_norm: 0.7334 loss: 0.6746
2023/06/01 04:52:14 - mmengine - INFO - Epoch(train) [16][1600/5758] lr: 1.5498e-04 eta: 3:09:55 time: 0.4179 data_time: 0.0022 memory: 20334 grad_norm: 0.8798 loss: 0.6776
2023/06/01 04:52:26 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:52:55 - mmengine - INFO - Epoch(train) [16][1700/5758] lr: 1.5498e-04 eta: 3:09:13 time: 0.4212 data_time: 0.0036 memory: 20334 grad_norm: 3.3916 loss: 0.6739
2023/06/01 04:53:34 - mmengine - INFO - Epoch(train) [16][1800/5758] lr: 1.5498e-04 eta: 3:08:30 time: 0.3852 data_time: 0.0019 memory: 20334 grad_norm: 0.2993 loss: 0.6853
2023/06/01 04:54:14 - mmengine - INFO - Epoch(train) [16][1900/5758] lr: 1.5498e-04 eta: 3:07:47 time: 0.4051 data_time: 0.0019 memory: 20334 grad_norm: 0.3990 loss: 0.6886
2023/06/01 04:54:55 - mmengine - INFO - Epoch(train) [16][2000/5758] lr: 1.5498e-04 eta: 3:07:05 time: 0.4105 data_time: 0.0028 memory: 20334 grad_norm: 0.1702 loss: 0.6884
2023/06/01 04:55:36 - mmengine - INFO - Epoch(train) [16][2100/5758] lr: 1.5498e-04 eta: 3:06:23 time: 0.4038 data_time: 0.0029 memory: 20334 grad_norm: 0.4629 loss: 0.6927
2023/06/01 04:56:17 - mmengine - INFO - Epoch(train) [16][2200/5758] lr: 1.5498e-04 eta: 3:05:41 time: 0.4490 data_time: 0.0022 memory: 20334 grad_norm: 17.8738 loss: 0.6876
2023/06/01 04:56:59 - mmengine - INFO - Epoch(train) [16][2300/5758] lr: 1.5498e-04 eta: 3:04:59 time: 0.4195 data_time: 0.0023 memory: 20334 grad_norm: 11.7259 loss: 0.6912
2023/06/01 04:57:40 - mmengine - INFO - Epoch(train) [16][2400/5758] lr: 1.5498e-04 eta: 3:04:16 time: 0.4291 data_time: 0.0018 memory: 20334 grad_norm: 1.5582 loss: 0.6845
2023/06/01 04:58:20 - mmengine - INFO - Epoch(train) [16][2500/5758] lr: 1.5498e-04 eta: 3:03:34 time: 0.3938 data_time: 0.0028 memory: 20334 grad_norm: 4.2360 loss: 0.6870
2023/06/01 04:59:01 - mmengine - INFO - Epoch(train) [16][2600/5758] lr: 1.5498e-04 eta: 3:02:52 time: 0.4125 data_time: 0.0018 memory: 20334 grad_norm: 0.7279 loss: 0.6894
2023/06/01 04:59:12 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 04:59:40 - mmengine - INFO - Epoch(train) [16][2700/5758] lr: 1.5498e-04 eta: 3:02:09 time: 0.4068 data_time: 0.0019 memory: 20334 grad_norm: 0.4202 loss: 0.6843
2023/06/01 05:00:20 - mmengine - INFO - Epoch(train) [16][2800/5758] lr: 1.5498e-04 eta: 3:01:27 time: 0.4023 data_time: 0.0018 memory: 20334 grad_norm: 0.1388 loss: 0.6857
2023/06/01 05:01:01 - mmengine - INFO - Epoch(train) [16][2900/5758] lr: 1.5498e-04 eta: 3:00:45 time: 0.4056 data_time: 0.0019 memory: 20334 grad_norm: 0.0492 loss: 0.6896
2023/06/01 05:01:42 - mmengine - INFO - Epoch(train) [16][3000/5758] lr: 1.5498e-04 eta: 3:00:02 time: 0.3845 data_time: 0.0018 memory: 20334 grad_norm: 2.3024 loss: 0.6834
2023/06/01 05:02:22 - mmengine - INFO - Epoch(train) [16][3100/5758] lr: 1.5498e-04 eta: 2:59:20 time: 0.3881 data_time: 0.0021 memory: 20334 grad_norm: 0.8119 loss: 0.6922
2023/06/01 05:03:03 - mmengine - INFO - Epoch(train) [16][3200/5758] lr: 1.5498e-04 eta: 2:58:38 time: 0.4399 data_time: 0.0018 memory: 20334 grad_norm: 0.2433 loss: 0.6882
2023/06/01 05:03:43 - mmengine - INFO - Epoch(train) [16][3300/5758] lr: 1.5498e-04 eta: 2:57:56 time: 0.4182 data_time: 0.0018 memory: 20334 grad_norm: 0.3354 loss: 0.6832
2023/06/01 05:04:24 - mmengine - INFO - Epoch(train) [16][3400/5758] lr: 1.5498e-04 eta: 2:57:13 time: 0.4243 data_time: 0.0017 memory: 20334 grad_norm: 0.2514 loss: 0.6861
2023/06/01 05:05:04 - mmengine - INFO - Epoch(train) [16][3500/5758] lr: 1.5498e-04 eta: 2:56:31 time: 0.4271 data_time: 0.0015 memory: 20334 grad_norm: 0.6776 loss: 0.6859
2023/06/01 05:05:45 - mmengine - INFO - Epoch(train) [16][3600/5758] lr: 1.5498e-04 eta: 2:55:49 time: 0.3950 data_time: 0.0016 memory: 20334 grad_norm: 0.6123 loss: 0.6826
2023/06/01 05:05:57 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:06:27 - mmengine - INFO - Epoch(train) [16][3700/5758] lr: 1.5498e-04 eta: 2:55:07 time: 0.4651 data_time: 0.0020 memory: 20334 grad_norm: 0.2018 loss: 0.6864
2023/06/01 05:07:08 - mmengine - INFO - Epoch(train) [16][3800/5758] lr: 1.5498e-04 eta: 2:54:25 time: 0.4244 data_time: 0.0019 memory: 20334 grad_norm: 0.2864 loss: 0.6877
2023/06/01 05:07:49 - mmengine - INFO - Epoch(train) [16][3900/5758] lr: 1.5498e-04 eta: 2:53:43 time: 0.4155 data_time: 0.0017 memory: 20334 grad_norm: 0.1539 loss: 0.6884
2023/06/01 05:08:30 - mmengine - INFO - Epoch(train) [16][4000/5758] lr: 1.5498e-04 eta: 2:53:01 time: 0.4340 data_time: 0.0017 memory: 20334 grad_norm: 0.1950 loss: 0.6865
2023/06/01 05:09:11 - mmengine - INFO - Epoch(train) [16][4100/5758] lr: 1.5498e-04 eta: 2:52:18 time: 0.4267 data_time: 0.0016 memory: 20334 grad_norm: 0.5413 loss: 0.6927
2023/06/01 05:09:51 - mmengine - INFO - Epoch(train) [16][4200/5758] lr: 1.5498e-04 eta: 2:51:36 time: 0.3973 data_time: 0.0017 memory: 20334 grad_norm: 0.7295 loss: 0.6888
2023/06/01 05:10:32 - mmengine - INFO - Epoch(train) [16][4300/5758] lr: 1.5498e-04 eta: 2:50:54 time: 0.3891 data_time: 0.0019 memory: 20334 grad_norm: 1.2996 loss: 0.6843
2023/06/01 05:11:13 - mmengine - INFO - Epoch(train) [16][4400/5758] lr: 1.5498e-04 eta: 2:50:12 time: 0.3753 data_time: 0.0016 memory: 20334 grad_norm: 0.5797 loss: 0.6847
2023/06/01 05:11:52 - mmengine - INFO - Epoch(train) [16][4500/5758] lr: 1.5498e-04 eta: 2:49:29 time: 0.3714 data_time: 0.0016 memory: 20334 grad_norm: 4.0275 loss: 0.6887
2023/06/01 05:12:34 - mmengine - INFO - Epoch(train) [16][4600/5758] lr: 1.5498e-04 eta: 2:48:47 time: 0.4044 data_time: 0.0016 memory: 20334 grad_norm: 2.0650 loss: 0.6807
2023/06/01 05:12:46 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:13:14 - mmengine - INFO - Epoch(train) [16][4700/5758] lr: 1.5498e-04 eta: 2:48:05 time: 0.3974 data_time: 0.0016 memory: 20334 grad_norm: 2.3120 loss: 0.6872
2023/06/01 05:13:56 - mmengine - INFO - Epoch(train) [16][4800/5758] lr: 1.5498e-04 eta: 2:47:23 time: 0.4134 data_time: 0.0016 memory: 20334 grad_norm: 6.7637 loss: 0.6780
2023/06/01 05:14:38 - mmengine - INFO - Epoch(train) [16][4900/5758] lr: 1.5498e-04 eta: 2:46:41 time: 0.4374 data_time: 0.0017 memory: 20334 grad_norm: 16.9621 loss: 0.6605
2023/06/01 05:15:18 - mmengine - INFO - Epoch(train) [16][5000/5758] lr: 1.5498e-04 eta: 2:45:59 time: 0.3812 data_time: 0.0016 memory: 20334 grad_norm: 7.3323 loss: 0.6797
2023/06/01 05:15:58 - mmengine - INFO - Epoch(train) [16][5100/5758] lr: 1.5498e-04 eta: 2:45:17 time: 0.3892 data_time: 0.0017 memory: 20334 grad_norm: 18.0871 loss: 0.6598
2023/06/01 05:16:39 - mmengine - INFO - Epoch(train) [16][5200/5758] lr: 1.5498e-04 eta: 2:44:35 time: 0.4083 data_time: 0.0015 memory: 20334 grad_norm: 422.3003 loss: 0.6643
2023/06/01 05:17:20 - mmengine - INFO - Epoch(train) [16][5300/5758] lr: 1.5498e-04 eta: 2:43:52 time: 0.3825 data_time: 0.0017 memory: 20334 grad_norm: 293.7801 loss: 0.6675
2023/06/01 05:18:00 - mmengine - INFO - Epoch(train) [16][5400/5758] lr: 1.5498e-04 eta: 2:43:10 time: 0.3898 data_time: 0.0016 memory: 20334 grad_norm: 63.2483 loss: 0.6756
2023/06/01 05:18:40 - mmengine - INFO - Epoch(train) [16][5500/5758] lr: 1.5498e-04 eta: 2:42:28 time: 0.4151 data_time: 0.0018 memory: 20334 grad_norm: 40.1539 loss: 0.6775
2023/06/01 05:19:21 - mmengine - INFO - Epoch(train) [16][5600/5758] lr: 1.5498e-04 eta: 2:41:46 time: 0.3952 data_time: 0.0020 memory: 20334 grad_norm: 77.3026 loss: 0.6732
2023/06/01 05:19:33 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:20:01 - mmengine - INFO - Epoch(train) [16][5700/5758] lr: 1.5498e-04 eta: 2:41:03 time: 0.3873 data_time: 0.0017 memory: 20334 grad_norm: 126.4350 loss: 0.6708
2023/06/01 05:20:25 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:20:25 - mmengine - INFO - Saving checkpoint at 16 epochs
2023/06/01 05:20:41 - mmengine - INFO - Epoch(val) [16][8/8] accuracy/top1: 98.0999 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [98.09986877441406, 0.0] single-label/f1-score_classwise: [99.04082489013672, 0.0] data_time: 0.3740 time: 1.0006
2023/06/01 05:21:25 - mmengine - INFO - Epoch(train) [17][ 100/5758] lr: 1.0454e-04 eta: 2:39:58 time: 0.4126 data_time: 0.0017 memory: 20334 grad_norm: 3.2546 loss: 0.6642
2023/06/01 05:22:06 - mmengine - INFO - Epoch(train) [17][ 200/5758] lr: 1.0454e-04 eta: 2:39:16 time: 0.4131 data_time: 0.0016 memory: 20334 grad_norm: 9.8711 loss: 0.6585
2023/06/01 05:22:48 - mmengine - INFO - Epoch(train) [17][ 300/5758] lr: 1.0454e-04 eta: 2:38:34 time: 0.4243 data_time: 0.0016 memory: 20334 grad_norm: 16.2878 loss: 0.6528
2023/06/01 05:23:29 - mmengine - INFO - Epoch(train) [17][ 400/5758] lr: 1.0454e-04 eta: 2:37:52 time: 0.4310 data_time: 0.0016 memory: 20334 grad_norm: 15.2030 loss: 0.6579
2023/06/01 05:24:10 - mmengine - INFO - Epoch(train) [17][ 500/5758] lr: 1.0454e-04 eta: 2:37:09 time: 0.4290 data_time: 0.0016 memory: 20334 grad_norm: 11.4554 loss: 0.6541
2023/06/01 05:24:50 - mmengine - INFO - Epoch(train) [17][ 600/5758] lr: 1.0454e-04 eta: 2:36:27 time: 0.4141 data_time: 0.0016 memory: 20334 grad_norm: 12.4674 loss: 0.6616
2023/06/01 05:25:31 - mmengine - INFO - Epoch(train) [17][ 700/5758] lr: 1.0454e-04 eta: 2:35:45 time: 0.4093 data_time: 0.0015 memory: 20334 grad_norm: 15.2415 loss: 0.6654
2023/06/01 05:26:12 - mmengine - INFO - Epoch(train) [17][ 800/5758] lr: 1.0454e-04 eta: 2:35:03 time: 0.4104 data_time: 0.0017 memory: 20334 grad_norm: 6.9286 loss: 0.6621
2023/06/01 05:26:41 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:26:53 - mmengine - INFO - Epoch(train) [17][ 900/5758] lr: 1.0454e-04 eta: 2:34:21 time: 0.4183 data_time: 0.0016 memory: 20334 grad_norm: 11.7854 loss: 0.6657
2023/06/01 05:27:33 - mmengine - INFO - Epoch(train) [17][1000/5758] lr: 1.0454e-04 eta: 2:33:39 time: 0.4516 data_time: 0.0019 memory: 20334 grad_norm: 19.8003 loss: 0.6556
2023/06/01 05:28:13 - mmengine - INFO - Epoch(train) [17][1100/5758] lr: 1.0454e-04 eta: 2:32:57 time: 0.3782 data_time: 0.0018 memory: 20334 grad_norm: 13.8714 loss: 0.6472
2023/06/01 05:28:54 - mmengine - INFO - Epoch(train) [17][1200/5758] lr: 1.0454e-04 eta: 2:32:15 time: 0.4360 data_time: 0.0033 memory: 20334 grad_norm: 22.1152 loss: 0.6549
2023/06/01 05:29:35 - mmengine - INFO - Epoch(train) [17][1300/5758] lr: 1.0454e-04 eta: 2:31:32 time: 0.4103 data_time: 0.0019 memory: 20334 grad_norm: 5.6001 loss: 0.6542
2023/06/01 05:30:16 - mmengine - INFO - Epoch(train) [17][1400/5758] lr: 1.0454e-04 eta: 2:30:50 time: 0.4596 data_time: 0.0024 memory: 20334 grad_norm: 11.6462 loss: 0.6577
2023/06/01 05:30:56 - mmengine - INFO - Epoch(train) [17][1500/5758] lr: 1.0454e-04 eta: 2:30:08 time: 0.3715 data_time: 0.0031 memory: 20334 grad_norm: 129.4207 loss: 0.6680
2023/06/01 05:31:36 - mmengine - INFO - Epoch(train) [17][1600/5758] lr: 1.0454e-04 eta: 2:29:26 time: 0.3841 data_time: 0.0024 memory: 20334 grad_norm: 32.7622 loss: 0.6641
2023/06/01 05:32:16 - mmengine - INFO - Epoch(train) [17][1700/5758] lr: 1.0454e-04 eta: 2:28:44 time: 0.3952 data_time: 0.0017 memory: 20334 grad_norm: 4.8323 loss: 0.6561
2023/06/01 05:32:56 - mmengine - INFO - Epoch(train) [17][1800/5758] lr: 1.0454e-04 eta: 2:28:01 time: 0.4079 data_time: 0.0021 memory: 20334 grad_norm: 3108.9795 loss: 0.6573
2023/06/01 05:33:26 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:33:38 - mmengine - INFO - Epoch(train) [17][1900/5758] lr: 1.0454e-04 eta: 2:27:20 time: 0.4417 data_time: 0.0018 memory: 20334 grad_norm: 7.6875 loss: 0.6517
2023/06/01 05:34:18 - mmengine - INFO - Epoch(train) [17][2000/5758] lr: 1.0454e-04 eta: 2:26:37 time: 0.4202 data_time: 0.0017 memory: 20334 grad_norm: 86.2196 loss: 0.6473
2023/06/01 05:34:59 - mmengine - INFO - Epoch(train) [17][2100/5758] lr: 1.0454e-04 eta: 2:25:55 time: 0.4056 data_time: 0.0027 memory: 20334 grad_norm: 60.8420 loss: 0.6557
2023/06/01 05:35:40 - mmengine - INFO - Epoch(train) [17][2200/5758] lr: 1.0454e-04 eta: 2:25:13 time: 0.3857 data_time: 0.0016 memory: 20334 grad_norm: 19.7107 loss: 0.6611
2023/06/01 05:36:20 - mmengine - INFO - Epoch(train) [17][2300/5758] lr: 1.0454e-04 eta: 2:24:31 time: 0.4219 data_time: 0.0028 memory: 20334 grad_norm: 8.5472 loss: 0.6529
2023/06/01 05:37:01 - mmengine - INFO - Epoch(train) [17][2400/5758] lr: 1.0454e-04 eta: 2:23:49 time: 0.4285 data_time: 0.0025 memory: 20334 grad_norm: 21.4679 loss: 0.6664
2023/06/01 05:37:41 - mmengine - INFO - Epoch(train) [17][2500/5758] lr: 1.0454e-04 eta: 2:23:07 time: 0.4013 data_time: 0.0024 memory: 20334 grad_norm: 29.7841 loss: 0.6476
2023/06/01 05:38:23 - mmengine - INFO - Epoch(train) [17][2600/5758] lr: 1.0454e-04 eta: 2:22:25 time: 0.4078 data_time: 0.0025 memory: 20334 grad_norm: 49.7052 loss: 0.6484
2023/06/01 05:39:05 - mmengine - INFO - Epoch(train) [17][2700/5758] lr: 1.0454e-04 eta: 2:21:43 time: 0.3936 data_time: 0.0023 memory: 20334 grad_norm: 8.6435 loss: 0.6472
2023/06/01 05:39:45 - mmengine - INFO - Epoch(train) [17][2800/5758] lr: 1.0454e-04 eta: 2:21:01 time: 0.4298 data_time: 0.0020 memory: 20334 grad_norm: 51.1484 loss: 0.6423
2023/06/01 05:40:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:40:25 - mmengine - INFO - Epoch(train) [17][2900/5758] lr: 1.0454e-04 eta: 2:20:19 time: 0.3986 data_time: 0.0033 memory: 20334 grad_norm: 19.7588 loss: 0.6388
2023/06/01 05:41:05 - mmengine - INFO - Epoch(train) [17][3000/5758] lr: 1.0454e-04 eta: 2:19:37 time: 0.4048 data_time: 0.0017 memory: 20334 grad_norm: 4.2124 loss: 0.6483
2023/06/01 05:41:45 - mmengine - INFO - Epoch(train) [17][3100/5758] lr: 1.0454e-04 eta: 2:18:54 time: 0.3725 data_time: 0.0019 memory: 20334 grad_norm: 3.8446 loss: 0.6351
2023/06/01 05:42:26 - mmengine - INFO - Epoch(train) [17][3200/5758] lr: 1.0454e-04 eta: 2:18:12 time: 0.4294 data_time: 0.0017 memory: 20334 grad_norm: 18.1920 loss: 0.6391
2023/06/01 05:43:06 - mmengine - INFO - Epoch(train) [17][3300/5758] lr: 1.0454e-04 eta: 2:17:30 time: 0.4056 data_time: 0.0030 memory: 20334 grad_norm: 9.1583 loss: 0.6451
2023/06/01 05:43:47 - mmengine - INFO - Epoch(train) [17][3400/5758] lr: 1.0454e-04 eta: 2:16:48 time: 0.3879 data_time: 0.0024 memory: 20334 grad_norm: 41.2997 loss: 0.6426
2023/06/01 05:44:27 - mmengine - INFO - Epoch(train) [17][3500/5758] lr: 1.0454e-04 eta: 2:16:06 time: 0.3872 data_time: 0.0018 memory: 20334 grad_norm: 37.7454 loss: 0.6478
2023/06/01 05:45:07 - mmengine - INFO - Epoch(train) [17][3600/5758] lr: 1.0454e-04 eta: 2:15:24 time: 0.3866 data_time: 0.0028 memory: 20334 grad_norm: 18.1691 loss: 0.6497
2023/06/01 05:45:47 - mmengine - INFO - Epoch(train) [17][3700/5758] lr: 1.0454e-04 eta: 2:14:42 time: 0.4066 data_time: 0.0017 memory: 20334 grad_norm: 63.3308 loss: 0.6567
2023/06/01 05:46:27 - mmengine - INFO - Epoch(train) [17][3800/5758] lr: 1.0454e-04 eta: 2:14:00 time: 0.3708 data_time: 0.0019 memory: 20334 grad_norm: 37.7261 loss: 0.6425
2023/06/01 05:46:57 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:47:07 - mmengine - INFO - Epoch(train) [17][3900/5758] lr: 1.0454e-04 eta: 2:13:17 time: 0.3940 data_time: 0.0021 memory: 20334 grad_norm: 221.2627 loss: 0.6479
2023/06/01 05:47:48 - mmengine - INFO - Epoch(train) [17][4000/5758] lr: 1.0454e-04 eta: 2:12:35 time: 0.3877 data_time: 0.0024 memory: 20334 grad_norm: 16.5339 loss: 0.6504
2023/06/01 05:48:29 - mmengine - INFO - Epoch(train) [17][4100/5758] lr: 1.0454e-04 eta: 2:11:53 time: 0.3799 data_time: 0.0022 memory: 20334 grad_norm: 46.5200 loss: 0.6470
2023/06/01 05:49:10 - mmengine - INFO - Epoch(train) [17][4200/5758] lr: 1.0454e-04 eta: 2:11:11 time: 0.4132 data_time: 0.0019 memory: 20334 grad_norm: 45.2016 loss: 0.6488
2023/06/01 05:49:50 - mmengine - INFO - Epoch(train) [17][4300/5758] lr: 1.0454e-04 eta: 2:10:29 time: 0.4007 data_time: 0.0017 memory: 20334 grad_norm: 82.1602 loss: 0.6456
2023/06/01 05:50:30 - mmengine - INFO - Epoch(train) [17][4400/5758] lr: 1.0454e-04 eta: 2:09:47 time: 0.3833 data_time: 0.0016 memory: 20334 grad_norm: 13.5351 loss: 0.6291
2023/06/01 05:51:09 - mmengine - INFO - Epoch(train) [17][4500/5758] lr: 1.0454e-04 eta: 2:09:05 time: 0.3848 data_time: 0.0022 memory: 20334 grad_norm: 8.1859 loss: 0.6446
2023/06/01 05:51:50 - mmengine - INFO - Epoch(train) [17][4600/5758] lr: 1.0454e-04 eta: 2:08:23 time: 0.3876 data_time: 0.0016 memory: 20334 grad_norm: 12.5262 loss: 0.6456
2023/06/01 05:52:31 - mmengine - INFO - Epoch(train) [17][4700/5758] lr: 1.0454e-04 eta: 2:07:41 time: 0.4495 data_time: 0.0024 memory: 20334 grad_norm: 54.9705 loss: 0.6526
2023/06/01 05:53:12 - mmengine - INFO - Epoch(train) [17][4800/5758] lr: 1.0454e-04 eta: 2:06:59 time: 0.4094 data_time: 0.0026 memory: 20334 grad_norm: 27.8509 loss: 0.6469
2023/06/01 05:53:42 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:53:52 - mmengine - INFO - Epoch(train) [17][4900/5758] lr: 1.0454e-04 eta: 2:06:17 time: 0.4024 data_time: 0.0023 memory: 20334 grad_norm: 15.3742 loss: 0.6598
2023/06/01 05:54:33 - mmengine - INFO - Epoch(train) [17][5000/5758] lr: 1.0454e-04 eta: 2:05:35 time: 0.4276 data_time: 0.0020 memory: 20334 grad_norm: 23.7485 loss: 0.6477
2023/06/01 05:55:13 - mmengine - INFO - Epoch(train) [17][5100/5758] lr: 1.0454e-04 eta: 2:04:53 time: 0.3852 data_time: 0.0021 memory: 20334 grad_norm: 8.8197 loss: 0.6474
2023/06/01 05:55:53 - mmengine - INFO - Epoch(train) [17][5200/5758] lr: 1.0454e-04 eta: 2:04:11 time: 0.4023 data_time: 0.0018 memory: 20334 grad_norm: 37.9932 loss: 0.6425
2023/06/01 05:56:33 - mmengine - INFO - Epoch(train) [17][5300/5758] lr: 1.0454e-04 eta: 2:03:29 time: 0.4000 data_time: 0.0023 memory: 20334 grad_norm: 34.1676 loss: 0.6537
2023/06/01 05:57:14 - mmengine - INFO - Epoch(train) [17][5400/5758] lr: 1.0454e-04 eta: 2:02:47 time: 0.4286 data_time: 0.0026 memory: 20334 grad_norm: 48.1832 loss: 0.6532
2023/06/01 05:57:54 - mmengine - INFO - Epoch(train) [17][5500/5758] lr: 1.0454e-04 eta: 2:02:04 time: 0.3951 data_time: 0.0017 memory: 20334 grad_norm: 16.8611 loss: 0.6360
2023/06/01 05:58:34 - mmengine - INFO - Epoch(train) [17][5600/5758] lr: 1.0454e-04 eta: 2:01:22 time: 0.4180 data_time: 0.0026 memory: 20334 grad_norm: 43.3313 loss: 0.6361
2023/06/01 05:59:15 - mmengine - INFO - Epoch(train) [17][5700/5758] lr: 1.0454e-04 eta: 2:00:40 time: 0.4013 data_time: 0.0019 memory: 20334 grad_norm: 32.4200 loss: 0.6425
2023/06/01 05:59:38 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 05:59:38 - mmengine - INFO - Saving checkpoint at 17 epochs
2023/06/01 05:59:55 - mmengine - INFO - Epoch(val) [17][8/8] accuracy/top1: 88.1131 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [88.11312103271484, 0.0] single-label/f1-score_classwise: [93.68099212646484, 0.0] data_time: 0.3581 time: 0.9834
2023/06/01 06:00:41 - mmengine - INFO - Epoch(train) [18][ 100/5758] lr: 6.3952e-05 eta: 1:59:35 time: 0.4257 data_time: 0.0021 memory: 20334 grad_norm: 63.2827 loss: 0.6445
2023/06/01 06:00:47 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 06:01:22 - mmengine - INFO - Epoch(train) [18][ 200/5758] lr: 6.3952e-05 eta: 1:58:53 time: 0.3952 data_time: 0.0027 memory: 20334 grad_norm: 71.9475 loss: 0.6376
2023/06/01 06:02:03 - mmengine - INFO - Epoch(train) [18][ 300/5758] lr: 6.3952e-05 eta: 1:58:11 time: 0.4254 data_time: 0.0019 memory: 20334 grad_norm: 27.9691 loss: 0.6340
2023/06/01 06:02:43 - mmengine - INFO - Epoch(train) [18][ 400/5758] lr: 6.3952e-05 eta: 1:57:29 time: 0.4136 data_time: 0.0029 memory: 20334 grad_norm: 33.1160 loss: 0.6605
2023/06/01 06:03:22 - mmengine - INFO - Epoch(train) [18][ 500/5758] lr: 6.3952e-05 eta: 1:56:47 time: 0.3749 data_time: 0.0022 memory: 20334 grad_norm: 21.2225 loss: 0.6445
2023/06/01 06:04:03 - mmengine - INFO - Epoch(train) [18][ 600/5758] lr: 6.3952e-05 eta: 1:56:05 time: 0.4187 data_time: 0.0020 memory: 20334 grad_norm: 75.3499 loss: 0.6529
2023/06/01 06:04:43 - mmengine - INFO - Epoch(train) [18][ 700/5758] lr: 6.3952e-05 eta: 1:55:23 time: 0.4211 data_time: 0.0018 memory: 20334 grad_norm: 131.6462 loss: 0.6250
2023/06/01 06:05:23 - mmengine - INFO - Epoch(train) [18][ 800/5758] lr: 6.3952e-05 eta: 1:54:41 time: 0.3788 data_time: 0.0018 memory: 20334 grad_norm: 71.8548 loss: 0.6417
2023/06/01 06:06:04 - mmengine - INFO - Epoch(train) [18][ 900/5758] lr: 6.3952e-05 eta: 1:53:59 time: 0.3884 data_time: 0.0018 memory: 20334 grad_norm: 14.1204 loss: 0.6533
2023/06/01 06:06:44 - mmengine - INFO - Epoch(train) [18][1000/5758] lr: 6.3952e-05 eta: 1:53:17 time: 0.4122 data_time: 0.0025 memory: 20334 grad_norm: 82.0788 loss: 0.6442
2023/06/01 06:07:24 - mmengine - INFO - Epoch(train) [18][1100/5758] lr: 6.3952e-05 eta: 1:52:35 time: 0.4319 data_time: 0.0026 memory: 20334 grad_norm: 17.0746 loss: 0.6485
2023/06/01 06:07:30 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241
2023/06/01 06:08:05 - mmengine - INFO - Epoch(train) [18][1200/5758] lr: 6.3952e-05 eta: 1:51:53 time: 0.3998 data_time: 0.0019 memory: 20334 grad_norm: 12.5568 loss: 0.6545 2023/06/01 06:08:45 - mmengine - INFO - Epoch(train) [18][1300/5758] lr: 6.3952e-05 eta: 1:51:11 time: 0.3827 data_time: 0.0024 memory: 20334 grad_norm: 43.7911 loss: 0.6472 2023/06/01 06:09:26 - mmengine - INFO - Epoch(train) [18][1400/5758] lr: 6.3952e-05 eta: 1:50:29 time: 0.3796 data_time: 0.0030 memory: 20334 grad_norm: 19.4557 loss: 0.6481 2023/06/01 06:10:07 - mmengine - INFO - Epoch(train) [18][1500/5758] lr: 6.3952e-05 eta: 1:49:47 time: 0.4128 data_time: 0.0015 memory: 20334 grad_norm: 19.8311 loss: 0.6462 2023/06/01 06:10:49 - mmengine - INFO - Epoch(train) [18][1600/5758] lr: 6.3952e-05 eta: 1:49:05 time: 0.4163 data_time: 0.0018 memory: 20334 grad_norm: 30.9645 loss: 0.6447 2023/06/01 06:11:30 - mmengine - INFO - Epoch(train) [18][1700/5758] lr: 6.3952e-05 eta: 1:48:23 time: 0.4512 data_time: 0.0023 memory: 20334 grad_norm: 20.1364 loss: 0.6417 2023/06/01 06:12:10 - mmengine - INFO - Epoch(train) [18][1800/5758] lr: 6.3952e-05 eta: 1:47:41 time: 0.3844 data_time: 0.0025 memory: 20334 grad_norm: 32.4031 loss: 0.6447 2023/06/01 06:12:50 - mmengine - INFO - Epoch(train) [18][1900/5758] lr: 6.3952e-05 eta: 1:46:59 time: 0.3970 data_time: 0.0023 memory: 20334 grad_norm: 14.2658 loss: 0.6529 2023/06/01 06:13:31 - mmengine - INFO - Epoch(train) [18][2000/5758] lr: 6.3952e-05 eta: 1:46:17 time: 0.3929 data_time: 0.0021 memory: 20334 grad_norm: 20.6992 loss: 0.6427 2023/06/01 06:14:12 - mmengine - INFO - Epoch(train) [18][2100/5758] lr: 6.3952e-05 eta: 1:45:36 time: 0.4156 data_time: 0.0026 memory: 20334 grad_norm: 70.5440 loss: 0.6593 2023/06/01 06:14:18 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:14:52 - mmengine - INFO - Epoch(train) [18][2200/5758] lr: 6.3952e-05 eta: 1:44:54 time: 0.3908 data_time: 0.0024 memory: 20334 grad_norm: 21.3339 loss: 
0.6493 2023/06/01 06:15:33 - mmengine - INFO - Epoch(train) [18][2300/5758] lr: 6.3952e-05 eta: 1:44:12 time: 0.3997 data_time: 0.0027 memory: 20334 grad_norm: 27.2712 loss: 0.6560 2023/06/01 06:16:13 - mmengine - INFO - Epoch(train) [18][2400/5758] lr: 6.3952e-05 eta: 1:43:30 time: 0.3754 data_time: 0.0022 memory: 20334 grad_norm: 25.0591 loss: 0.6598 2023/06/01 06:16:54 - mmengine - INFO - Epoch(train) [18][2500/5758] lr: 6.3952e-05 eta: 1:42:48 time: 0.3689 data_time: 0.0018 memory: 20334 grad_norm: 26.6567 loss: 0.6417 2023/06/01 06:17:35 - mmengine - INFO - Epoch(train) [18][2600/5758] lr: 6.3952e-05 eta: 1:42:06 time: 0.3946 data_time: 0.0029 memory: 20334 grad_norm: 44.6973 loss: 0.6490 2023/06/01 06:18:15 - mmengine - INFO - Epoch(train) [18][2700/5758] lr: 6.3952e-05 eta: 1:41:24 time: 0.3943 data_time: 0.0018 memory: 20334 grad_norm: 225.2489 loss: 0.6620 2023/06/01 06:18:56 - mmengine - INFO - Epoch(train) [18][2800/5758] lr: 6.3952e-05 eta: 1:40:42 time: 0.3994 data_time: 0.0018 memory: 20334 grad_norm: 83.4851 loss: 0.6607 2023/06/01 06:19:37 - mmengine - INFO - Epoch(train) [18][2900/5758] lr: 6.3952e-05 eta: 1:40:00 time: 0.3779 data_time: 0.0022 memory: 20334 grad_norm: 44.0406 loss: 0.6465 2023/06/01 06:20:18 - mmengine - INFO - Epoch(train) [18][3000/5758] lr: 6.3952e-05 eta: 1:39:18 time: 0.4228 data_time: 0.0032 memory: 20334 grad_norm: 24.5456 loss: 0.6401 2023/06/01 06:20:59 - mmengine - INFO - Epoch(train) [18][3100/5758] lr: 6.3952e-05 eta: 1:38:37 time: 0.3933 data_time: 0.0018 memory: 20334 grad_norm: 128.9199 loss: 0.6634 2023/06/01 06:21:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:21:41 - mmengine - INFO - Epoch(train) [18][3200/5758] lr: 6.3952e-05 eta: 1:37:55 time: 0.4023 data_time: 0.0022 memory: 20334 grad_norm: 152.1927 loss: 0.6575 2023/06/01 06:22:23 - mmengine - INFO - Epoch(train) [18][3300/5758] lr: 6.3952e-05 eta: 1:37:13 time: 0.3871 data_time: 0.0017 memory: 20334 grad_norm: 
34.5417 loss: 0.6583 2023/06/01 06:23:04 - mmengine - INFO - Epoch(train) [18][3400/5758] lr: 6.3952e-05 eta: 1:36:31 time: 0.3989 data_time: 0.0022 memory: 20334 grad_norm: 10.8451 loss: 0.6594 2023/06/01 06:23:45 - mmengine - INFO - Epoch(train) [18][3500/5758] lr: 6.3952e-05 eta: 1:35:49 time: 0.4142 data_time: 0.0020 memory: 20334 grad_norm: 136.5663 loss: 0.6515 2023/06/01 06:24:25 - mmengine - INFO - Epoch(train) [18][3600/5758] lr: 6.3952e-05 eta: 1:35:08 time: 0.3946 data_time: 0.0016 memory: 20334 grad_norm: 26.9938 loss: 0.6600 2023/06/01 06:25:06 - mmengine - INFO - Epoch(train) [18][3700/5758] lr: 6.3952e-05 eta: 1:34:26 time: 0.3926 data_time: 0.0021 memory: 20334 grad_norm: 32.2152 loss: 0.6732 2023/06/01 06:25:47 - mmengine - INFO - Epoch(train) [18][3800/5758] lr: 6.3952e-05 eta: 1:33:44 time: 0.4148 data_time: 0.0030 memory: 20334 grad_norm: 63.9369 loss: 0.6618 2023/06/01 06:26:28 - mmengine - INFO - Epoch(train) [18][3900/5758] lr: 6.3952e-05 eta: 1:33:02 time: 0.4113 data_time: 0.0028 memory: 20334 grad_norm: 25.9544 loss: 0.6741 2023/06/01 06:27:08 - mmengine - INFO - Epoch(train) [18][4000/5758] lr: 6.3952e-05 eta: 1:32:20 time: 0.4400 data_time: 0.0018 memory: 20334 grad_norm: 71.6971 loss: 0.6596 2023/06/01 06:27:49 - mmengine - INFO - Epoch(train) [18][4100/5758] lr: 6.3952e-05 eta: 1:31:38 time: 0.3902 data_time: 0.0021 memory: 20334 grad_norm: 46.5575 loss: 0.6668 2023/06/01 06:27:55 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:28:31 - mmengine - INFO - Epoch(train) [18][4200/5758] lr: 6.3952e-05 eta: 1:30:56 time: 0.4282 data_time: 0.0022 memory: 20334 grad_norm: 102.1191 loss: 0.6648 2023/06/01 06:29:12 - mmengine - INFO - Epoch(train) [18][4300/5758] lr: 6.3952e-05 eta: 1:30:15 time: 0.3854 data_time: 0.0017 memory: 20334 grad_norm: 91.0550 loss: 0.6600 2023/06/01 06:29:53 - mmengine - INFO - Epoch(train) [18][4400/5758] lr: 6.3952e-05 eta: 1:29:33 time: 0.3878 data_time: 0.0019 memory: 20334 
grad_norm: 40.9992 loss: 0.6546 2023/06/01 06:30:34 - mmengine - INFO - Epoch(train) [18][4500/5758] lr: 6.3952e-05 eta: 1:28:51 time: 0.4111 data_time: 0.0018 memory: 20334 grad_norm: 58.0564 loss: 0.6499 2023/06/01 06:31:15 - mmengine - INFO - Epoch(train) [18][4600/5758] lr: 6.3952e-05 eta: 1:28:09 time: 0.4121 data_time: 0.0021 memory: 20334 grad_norm: 38.4158 loss: 0.6537 2023/06/01 06:31:56 - mmengine - INFO - Epoch(train) [18][4700/5758] lr: 6.3952e-05 eta: 1:27:27 time: 0.4433 data_time: 0.0022 memory: 20334 grad_norm: 47.8747 loss: 0.6522 2023/06/01 06:32:37 - mmengine - INFO - Epoch(train) [18][4800/5758] lr: 6.3952e-05 eta: 1:26:45 time: 0.4109 data_time: 0.0029 memory: 20334 grad_norm: 42.6568 loss: 0.6453 2023/06/01 06:33:18 - mmengine - INFO - Epoch(train) [18][4900/5758] lr: 6.3952e-05 eta: 1:26:04 time: 0.4342 data_time: 0.0020 memory: 20334 grad_norm: 59.3426 loss: 0.6423 2023/06/01 06:33:58 - mmengine - INFO - Epoch(train) [18][5000/5758] lr: 6.3952e-05 eta: 1:25:22 time: 0.3906 data_time: 0.0017 memory: 20334 grad_norm: 15.6146 loss: 0.6566 2023/06/01 06:34:37 - mmengine - INFO - Epoch(train) [18][5100/5758] lr: 6.3952e-05 eta: 1:24:40 time: 0.3962 data_time: 0.0027 memory: 20334 grad_norm: 21.6596 loss: 0.6484 2023/06/01 06:34:44 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:35:18 - mmengine - INFO - Epoch(train) [18][5200/5758] lr: 6.3952e-05 eta: 1:23:58 time: 0.3806 data_time: 0.0023 memory: 20334 grad_norm: 296.3698 loss: 0.6488 2023/06/01 06:35:59 - mmengine - INFO - Epoch(train) [18][5300/5758] lr: 6.3952e-05 eta: 1:23:16 time: 0.4201 data_time: 0.0024 memory: 20334 grad_norm: 171.4270 loss: 0.6475 2023/06/01 06:36:42 - mmengine - INFO - Epoch(train) [18][5400/5758] lr: 6.3952e-05 eta: 1:22:34 time: 0.4169 data_time: 0.0020 memory: 20334 grad_norm: 15.7709 loss: 0.6602 2023/06/01 06:37:22 - mmengine - INFO - Epoch(train) [18][5500/5758] lr: 6.3952e-05 eta: 1:21:52 time: 0.4175 data_time: 0.0030 memory: 
20334 grad_norm: 15.7511 loss: 0.6604 2023/06/01 06:38:03 - mmengine - INFO - Epoch(train) [18][5600/5758] lr: 6.3952e-05 eta: 1:21:11 time: 0.4094 data_time: 0.0024 memory: 20334 grad_norm: 16.8459 loss: 0.6537 2023/06/01 06:38:44 - mmengine - INFO - Epoch(train) [18][5700/5758] lr: 6.3952e-05 eta: 1:20:29 time: 0.4027 data_time: 0.0019 memory: 20334 grad_norm: 27.6579 loss: 0.6491 2023/06/01 06:39:07 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:39:07 - mmengine - INFO - Saving checkpoint at 18 epochs 2023/06/01 06:39:24 - mmengine - INFO - Epoch(val) [18][8/8] accuracy/top1: 86.9895 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [86.98945617675781, 0.0] single-label/f1-score_classwise: [93.0420913696289, 0.0] data_time: 0.3680 time: 0.9957 2023/06/01 06:40:09 - mmengine - INFO - Epoch(train) [19][ 100/5758] lr: 3.4227e-05 eta: 1:19:23 time: 0.4029 data_time: 0.0020 memory: 20334 grad_norm: 66.6539 loss: 0.6515 2023/06/01 06:40:49 - mmengine - INFO - Epoch(train) [19][ 200/5758] lr: 3.4227e-05 eta: 1:18:41 time: 0.4083 data_time: 0.0019 memory: 20334 grad_norm: 34.7455 loss: 0.6644 2023/06/01 06:41:31 - mmengine - INFO - Epoch(train) [19][ 300/5758] lr: 3.4227e-05 eta: 1:18:00 time: 0.3937 data_time: 0.0020 memory: 20334 grad_norm: 213.3387 loss: 0.6749 2023/06/01 06:41:54 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:42:12 - mmengine - INFO - Epoch(train) [19][ 400/5758] lr: 3.4227e-05 eta: 1:17:18 time: 0.4123 data_time: 0.0018 memory: 20334 grad_norm: 74.3776 loss: 0.6606 2023/06/01 06:42:53 - mmengine - INFO - Epoch(train) [19][ 500/5758] lr: 3.4227e-05 eta: 1:16:36 time: 0.4405 data_time: 0.0019 memory: 20334 grad_norm: 89.7455 loss: 0.6422 2023/06/01 06:43:34 - mmengine - INFO - Epoch(train) [19][ 600/5758] lr: 3.4227e-05 eta: 1:15:54 time: 0.4268 data_time: 0.0017 memory: 20334 grad_norm: 40.8558 loss: 0.6535 2023/06/01 06:44:14 - mmengine - INFO - 
Epoch(train) [19][ 700/5758] lr: 3.4227e-05 eta: 1:15:12 time: 0.3833 data_time: 0.0018 memory: 20334 grad_norm: 15.9707 loss: 0.6586 2023/06/01 06:44:54 - mmengine - INFO - Epoch(train) [19][ 800/5758] lr: 3.4227e-05 eta: 1:14:30 time: 0.4180 data_time: 0.0018 memory: 20334 grad_norm: 18.2879 loss: 0.6529 2023/06/01 06:45:35 - mmengine - INFO - Epoch(train) [19][ 900/5758] lr: 3.4227e-05 eta: 1:13:49 time: 0.3997 data_time: 0.0018 memory: 20334 grad_norm: 14.3042 loss: 0.6278 2023/06/01 06:46:16 - mmengine - INFO - Epoch(train) [19][1000/5758] lr: 3.4227e-05 eta: 1:13:07 time: 0.3913 data_time: 0.0018 memory: 20334 grad_norm: 9.4428 loss: 0.6414 2023/06/01 06:46:58 - mmengine - INFO - Epoch(train) [19][1100/5758] lr: 3.4227e-05 eta: 1:12:25 time: 0.3959 data_time: 0.0019 memory: 20334 grad_norm: 40.7144 loss: 0.6593 2023/06/01 06:47:38 - mmengine - INFO - Epoch(train) [19][1200/5758] lr: 3.4227e-05 eta: 1:11:43 time: 0.3841 data_time: 0.0023 memory: 20334 grad_norm: 13.6276 loss: 0.6553 2023/06/01 06:48:19 - mmengine - INFO - Epoch(train) [19][1300/5758] lr: 3.4227e-05 eta: 1:11:01 time: 0.4204 data_time: 0.0023 memory: 20334 grad_norm: 14.2798 loss: 0.6425 2023/06/01 06:48:43 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:49:01 - mmengine - INFO - Epoch(train) [19][1400/5758] lr: 3.4227e-05 eta: 1:10:20 time: 0.4097 data_time: 0.0019 memory: 20334 grad_norm: 61.6538 loss: 0.6414 2023/06/01 06:49:43 - mmengine - INFO - Epoch(train) [19][1500/5758] lr: 3.4227e-05 eta: 1:09:38 time: 0.3920 data_time: 0.0017 memory: 20334 grad_norm: 19.8346 loss: 0.6525 2023/06/01 06:50:24 - mmengine - INFO - Epoch(train) [19][1600/5758] lr: 3.4227e-05 eta: 1:08:56 time: 0.3671 data_time: 0.0021 memory: 20334 grad_norm: 21.9790 loss: 0.6535 2023/06/01 06:51:05 - mmengine - INFO - Epoch(train) [19][1700/5758] lr: 3.4227e-05 eta: 1:08:14 time: 0.4107 data_time: 0.0027 memory: 20334 grad_norm: 9.2302 loss: 0.6392 2023/06/01 06:51:47 - mmengine - INFO 
- Epoch(train) [19][1800/5758] lr: 3.4227e-05 eta: 1:07:33 time: 0.4140 data_time: 0.0027 memory: 20334 grad_norm: 25.4786 loss: 0.6501 2023/06/01 06:52:29 - mmengine - INFO - Epoch(train) [19][1900/5758] lr: 3.4227e-05 eta: 1:06:51 time: 0.4219 data_time: 0.0018 memory: 20334 grad_norm: 31.8239 loss: 0.6490 2023/06/01 06:53:10 - mmengine - INFO - Epoch(train) [19][2000/5758] lr: 3.4227e-05 eta: 1:06:09 time: 0.3887 data_time: 0.0018 memory: 20334 grad_norm: 42.6173 loss: 0.6465 2023/06/01 06:53:51 - mmengine - INFO - Epoch(train) [19][2100/5758] lr: 3.4227e-05 eta: 1:05:27 time: 0.4121 data_time: 0.0020 memory: 20334 grad_norm: 23.1560 loss: 0.6574 2023/06/01 06:54:31 - mmengine - INFO - Epoch(train) [19][2200/5758] lr: 3.4227e-05 eta: 1:04:46 time: 0.4002 data_time: 0.0023 memory: 20334 grad_norm: 22.8971 loss: 0.6566 2023/06/01 06:55:12 - mmengine - INFO - Epoch(train) [19][2300/5758] lr: 3.4227e-05 eta: 1:04:04 time: 0.3821 data_time: 0.0017 memory: 20334 grad_norm: 30.9176 loss: 0.6556 2023/06/01 06:55:34 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 06:55:52 - mmengine - INFO - Epoch(train) [19][2400/5758] lr: 3.4227e-05 eta: 1:03:22 time: 0.3650 data_time: 0.0019 memory: 20334 grad_norm: 41.5924 loss: 0.6587 2023/06/01 06:56:33 - mmengine - INFO - Epoch(train) [19][2500/5758] lr: 3.4227e-05 eta: 1:02:40 time: 0.4692 data_time: 0.0019 memory: 20334 grad_norm: 33.9543 loss: 0.6456 2023/06/01 06:57:12 - mmengine - INFO - Epoch(train) [19][2600/5758] lr: 3.4227e-05 eta: 1:01:58 time: 0.3927 data_time: 0.0018 memory: 20334 grad_norm: 40.9530 loss: 0.6533 2023/06/01 06:57:54 - mmengine - INFO - Epoch(train) [19][2700/5758] lr: 3.4227e-05 eta: 1:01:17 time: 0.4043 data_time: 0.0018 memory: 20334 grad_norm: 14.2156 loss: 0.6593 2023/06/01 06:58:34 - mmengine - INFO - Epoch(train) [19][2800/5758] lr: 3.4227e-05 eta: 1:00:35 time: 0.4019 data_time: 0.0020 memory: 20334 grad_norm: 143.3173 loss: 0.6481 2023/06/01 06:59:16 - mmengine - 
INFO - Epoch(train) [19][2900/5758] lr: 3.4227e-05 eta: 0:59:53 time: 0.4302 data_time: 0.0020 memory: 20334 grad_norm: 40.7847 loss: 0.6658 2023/06/01 06:59:57 - mmengine - INFO - Epoch(train) [19][3000/5758] lr: 3.4227e-05 eta: 0:59:11 time: 0.4114 data_time: 0.0019 memory: 20334 grad_norm: 19.6428 loss: 0.6577 2023/06/01 07:00:37 - mmengine - INFO - Epoch(train) [19][3100/5758] lr: 3.4227e-05 eta: 0:58:29 time: 0.4089 data_time: 0.0015 memory: 20334 grad_norm: 75.8778 loss: 0.6442 2023/06/01 07:01:18 - mmengine - INFO - Epoch(train) [19][3200/5758] lr: 3.4227e-05 eta: 0:57:48 time: 0.4103 data_time: 0.0015 memory: 20334 grad_norm: 12.3155 loss: 0.6522 2023/06/01 07:01:58 - mmengine - INFO - Epoch(train) [19][3300/5758] lr: 3.4227e-05 eta: 0:57:06 time: 0.4080 data_time: 0.0017 memory: 20334 grad_norm: 14.5197 loss: 0.6573 2023/06/01 07:02:21 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:02:38 - mmengine - INFO - Epoch(train) [19][3400/5758] lr: 3.4227e-05 eta: 0:56:24 time: 0.3837 data_time: 0.0022 memory: 20334 grad_norm: 49.5584 loss: 0.6521 2023/06/01 07:03:19 - mmengine - INFO - Epoch(train) [19][3500/5758] lr: 3.4227e-05 eta: 0:55:42 time: 0.3867 data_time: 0.0017 memory: 20334 grad_norm: 34.7282 loss: 0.6657 2023/06/01 07:04:00 - mmengine - INFO - Epoch(train) [19][3600/5758] lr: 3.4227e-05 eta: 0:55:00 time: 0.3831 data_time: 0.0016 memory: 20334 grad_norm: 17.0216 loss: 0.6684 2023/06/01 07:04:41 - mmengine - INFO - Epoch(train) [19][3700/5758] lr: 3.4227e-05 eta: 0:54:19 time: 0.3945 data_time: 0.0017 memory: 20334 grad_norm: 30.0264 loss: 0.6665 2023/06/01 07:05:22 - mmengine - INFO - Epoch(train) [19][3800/5758] lr: 3.4227e-05 eta: 0:53:37 time: 0.3869 data_time: 0.0016 memory: 20334 grad_norm: 19.3257 loss: 0.6644 2023/06/01 07:06:03 - mmengine - INFO - Epoch(train) [19][3900/5758] lr: 3.4227e-05 eta: 0:52:55 time: 0.4218 data_time: 0.0016 memory: 20334 grad_norm: 36.1686 loss: 0.6727 2023/06/01 07:06:43 - 
mmengine - INFO - Epoch(train) [19][4000/5758] lr: 3.4227e-05 eta: 0:52:13 time: 0.4407 data_time: 0.0017 memory: 20334 grad_norm: 12.6983 loss: 0.6665 2023/06/01 07:07:23 - mmengine - INFO - Epoch(train) [19][4100/5758] lr: 3.4227e-05 eta: 0:51:32 time: 0.4180 data_time: 0.0016 memory: 20334 grad_norm: 18.5814 loss: 0.6727 2023/06/01 07:08:05 - mmengine - INFO - Epoch(train) [19][4200/5758] lr: 3.4227e-05 eta: 0:50:50 time: 0.4268 data_time: 0.0016 memory: 20334 grad_norm: 40.6030 loss: 0.6681 2023/06/01 07:08:44 - mmengine - INFO - Epoch(train) [19][4300/5758] lr: 3.4227e-05 eta: 0:50:08 time: 0.4036 data_time: 0.0016 memory: 20334 grad_norm: 85.0285 loss: 0.6686 2023/06/01 07:09:08 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:09:26 - mmengine - INFO - Epoch(train) [19][4400/5758] lr: 3.4227e-05 eta: 0:49:26 time: 0.4441 data_time: 0.0016 memory: 20334 grad_norm: 62.7240 loss: 0.6558 2023/06/01 07:10:07 - mmengine - INFO - Epoch(train) [19][4500/5758] lr: 3.4227e-05 eta: 0:48:45 time: 0.4474 data_time: 0.0018 memory: 20334 grad_norm: 99.8403 loss: 0.6742 2023/06/01 07:10:47 - mmengine - INFO - Epoch(train) [19][4600/5758] lr: 3.4227e-05 eta: 0:48:03 time: 0.4304 data_time: 0.0016 memory: 20334 grad_norm: 57.7155 loss: 0.6722 2023/06/01 07:11:28 - mmengine - INFO - Epoch(train) [19][4700/5758] lr: 3.4227e-05 eta: 0:47:21 time: 0.4323 data_time: 0.0017 memory: 20334 grad_norm: 80.3763 loss: 0.6625 2023/06/01 07:12:08 - mmengine - INFO - Epoch(train) [19][4800/5758] lr: 3.4227e-05 eta: 0:46:39 time: 0.3881 data_time: 0.0020 memory: 20334 grad_norm: 62.8092 loss: 0.6744 2023/06/01 07:12:50 - mmengine - INFO - Epoch(train) [19][4900/5758] lr: 3.4227e-05 eta: 0:45:58 time: 0.3931 data_time: 0.0020 memory: 20334 grad_norm: 31.4577 loss: 0.6706 2023/06/01 07:13:31 - mmengine - INFO - Epoch(train) [19][5000/5758] lr: 3.4227e-05 eta: 0:45:16 time: 0.4111 data_time: 0.0020 memory: 20334 grad_norm: 56.2091 loss: 0.6599 2023/06/01 
07:14:12 - mmengine - INFO - Epoch(train) [19][5100/5758] lr: 3.4227e-05 eta: 0:44:34 time: 0.4001 data_time: 0.0014 memory: 20334 grad_norm: 206.1507 loss: 0.6731 2023/06/01 07:14:53 - mmengine - INFO - Epoch(train) [19][5200/5758] lr: 3.4227e-05 eta: 0:43:52 time: 0.3867 data_time: 0.0036 memory: 20334 grad_norm: 44.8787 loss: 0.6735 2023/06/01 07:15:33 - mmengine - INFO - Epoch(train) [19][5300/5758] lr: 3.4227e-05 eta: 0:43:11 time: 0.4130 data_time: 0.0024 memory: 20334 grad_norm: 79.7090 loss: 0.6719 2023/06/01 07:15:57 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:16:15 - mmengine - INFO - Epoch(train) [19][5400/5758] lr: 3.4227e-05 eta: 0:42:29 time: 0.4001 data_time: 0.0019 memory: 20334 grad_norm: 58.0159 loss: 0.6580 2023/06/01 07:16:55 - mmengine - INFO - Epoch(train) [19][5500/5758] lr: 3.4227e-05 eta: 0:41:47 time: 0.3854 data_time: 0.0022 memory: 20334 grad_norm: 86.6426 loss: 0.6738 2023/06/01 07:17:36 - mmengine - INFO - Epoch(train) [19][5600/5758] lr: 3.4227e-05 eta: 0:41:05 time: 0.3885 data_time: 0.0017 memory: 20334 grad_norm: 38.4154 loss: 0.6705 2023/06/01 07:18:17 - mmengine - INFO - Epoch(train) [19][5700/5758] lr: 3.4227e-05 eta: 0:40:24 time: 0.3995 data_time: 0.0018 memory: 20334 grad_norm: 203.1712 loss: 0.6734 2023/06/01 07:18:40 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:18:40 - mmengine - INFO - Saving checkpoint at 19 epochs 2023/06/01 07:18:56 - mmengine - INFO - Epoch(val) [19][8/8] accuracy/top1: 85.8342 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [85.834228515625, 0.0] single-label/f1-score_classwise: [92.377197265625, 0.0] data_time: 0.3582 time: 0.9835 2023/06/01 07:19:41 - mmengine - INFO - Epoch(train) [20][ 100/5758] lr: 1.6094e-05 eta: 0:39:18 time: 0.3837 data_time: 0.0018 memory: 20334 grad_norm: 37.8316 loss: 0.6657 2023/06/01 07:20:23 - mmengine - INFO - Epoch(train) [20][ 200/5758] lr: 1.6094e-05 eta: 
0:38:36 time: 0.3624 data_time: 0.0023 memory: 20334 grad_norm: 131.8389 loss: 0.6685 2023/06/01 07:21:04 - mmengine - INFO - Epoch(train) [20][ 300/5758] lr: 1.6094e-05 eta: 0:37:55 time: 0.4057 data_time: 0.0032 memory: 20334 grad_norm: 73.4937 loss: 0.6725 2023/06/01 07:21:44 - mmengine - INFO - Epoch(train) [20][ 400/5758] lr: 1.6094e-05 eta: 0:37:13 time: 0.3943 data_time: 0.0029 memory: 20334 grad_norm: 48.6913 loss: 0.6698 2023/06/01 07:22:26 - mmengine - INFO - Epoch(train) [20][ 500/5758] lr: 1.6094e-05 eta: 0:36:31 time: 0.3894 data_time: 0.0024 memory: 20334 grad_norm: 47.0496 loss: 0.6681 2023/06/01 07:23:05 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:23:06 - mmengine - INFO - Epoch(train) [20][ 600/5758] lr: 1.6094e-05 eta: 0:35:49 time: 0.3859 data_time: 0.0024 memory: 20334 grad_norm: 1127.3899 loss: 0.6733 2023/06/01 07:23:47 - mmengine - INFO - Epoch(train) [20][ 700/5758] lr: 1.6094e-05 eta: 0:35:08 time: 0.4176 data_time: 0.0024 memory: 20334 grad_norm: 82.1451 loss: 0.6699 2023/06/01 07:24:29 - mmengine - INFO - Epoch(train) [20][ 800/5758] lr: 1.6094e-05 eta: 0:34:26 time: 0.4404 data_time: 0.0017 memory: 20334 grad_norm: 11.3550 loss: 0.6541 2023/06/01 07:25:09 - mmengine - INFO - Epoch(train) [20][ 900/5758] lr: 1.6094e-05 eta: 0:33:44 time: 0.3815 data_time: 0.0021 memory: 20334 grad_norm: 44.5827 loss: 0.6699 2023/06/01 07:25:49 - mmengine - INFO - Epoch(train) [20][1000/5758] lr: 1.6094e-05 eta: 0:33:03 time: 0.3839 data_time: 0.0020 memory: 20334 grad_norm: 181.1916 loss: 0.6704 2023/06/01 07:26:31 - mmengine - INFO - Epoch(train) [20][1100/5758] lr: 1.6094e-05 eta: 0:32:21 time: 0.4080 data_time: 0.0026 memory: 20334 grad_norm: 29.3792 loss: 0.6695 2023/06/01 07:27:11 - mmengine - INFO - Epoch(train) [20][1200/5758] lr: 1.6094e-05 eta: 0:31:39 time: 0.3870 data_time: 0.0019 memory: 20334 grad_norm: 112.6470 loss: 0.6738 2023/06/01 07:27:51 - mmengine - INFO - Epoch(train) [20][1300/5758] lr: 
1.6094e-05 eta: 0:30:57 time: 0.3776 data_time: 0.0025 memory: 20334 grad_norm: 97.6848 loss: 0.6756 2023/06/01 07:28:31 - mmengine - INFO - Epoch(train) [20][1400/5758] lr: 1.6094e-05 eta: 0:30:16 time: 0.3918 data_time: 0.0017 memory: 20334 grad_norm: 71.3823 loss: 0.6758 2023/06/01 07:29:13 - mmengine - INFO - Epoch(train) [20][1500/5758] lr: 1.6094e-05 eta: 0:29:34 time: 0.4175 data_time: 0.0031 memory: 20334 grad_norm: 12.3704 loss: 0.6698 2023/06/01 07:29:53 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:29:54 - mmengine - INFO - Epoch(train) [20][1600/5758] lr: 1.6094e-05 eta: 0:28:52 time: 0.4094 data_time: 0.0020 memory: 20334 grad_norm: 217.7896 loss: 0.6712 2023/06/01 07:30:35 - mmengine - INFO - Epoch(train) [20][1700/5758] lr: 1.6094e-05 eta: 0:28:11 time: 0.4169 data_time: 0.0020 memory: 20334 grad_norm: 75.2327 loss: 0.6697 2023/06/01 07:31:16 - mmengine - INFO - Epoch(train) [20][1800/5758] lr: 1.6094e-05 eta: 0:27:29 time: 0.3831 data_time: 0.0026 memory: 20334 grad_norm: 49.7328 loss: 0.6667 2023/06/01 07:31:56 - mmengine - INFO - Epoch(train) [20][1900/5758] lr: 1.6094e-05 eta: 0:26:47 time: 0.3818 data_time: 0.0018 memory: 20334 grad_norm: 33.9477 loss: 0.6709 2023/06/01 07:32:37 - mmengine - INFO - Epoch(train) [20][2000/5758] lr: 1.6094e-05 eta: 0:26:05 time: 0.3901 data_time: 0.0018 memory: 20334 grad_norm: 148.2320 loss: 0.6696 2023/06/01 07:33:17 - mmengine - INFO - Epoch(train) [20][2100/5758] lr: 1.6094e-05 eta: 0:25:24 time: 0.3885 data_time: 0.0021 memory: 20334 grad_norm: 24.1435 loss: 0.6764 2023/06/01 07:33:57 - mmengine - INFO - Epoch(train) [20][2200/5758] lr: 1.6094e-05 eta: 0:24:42 time: 0.3702 data_time: 0.0018 memory: 20334 grad_norm: 29.8747 loss: 0.6745 2023/06/01 07:34:39 - mmengine - INFO - Epoch(train) [20][2300/5758] lr: 1.6094e-05 eta: 0:24:00 time: 0.4365 data_time: 0.0020 memory: 20334 grad_norm: 91.6214 loss: 0.6710 2023/06/01 07:35:20 - mmengine - INFO - Epoch(train) 
[20][2400/5758] lr: 1.6094e-05 eta: 0:23:19 time: 0.4153 data_time: 0.0017 memory: 20334 grad_norm: 86.2144 loss: 0.6671 2023/06/01 07:36:01 - mmengine - INFO - Epoch(train) [20][2500/5758] lr: 1.6094e-05 eta: 0:22:37 time: 0.4084 data_time: 0.0026 memory: 20334 grad_norm: 304.8029 loss: 0.6622 2023/06/01 07:36:41 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:36:41 - mmengine - INFO - Epoch(train) [20][2600/5758] lr: 1.6094e-05 eta: 0:21:55 time: 0.4042 data_time: 0.0022 memory: 20334 grad_norm: 97.6951 loss: 0.6740 2023/06/01 07:37:22 - mmengine - INFO - Epoch(train) [20][2700/5758] lr: 1.6094e-05 eta: 0:21:14 time: 0.3819 data_time: 0.0019 memory: 20334 grad_norm: 93.4661 loss: 0.6579 2023/06/01 07:38:02 - mmengine - INFO - Epoch(train) [20][2800/5758] lr: 1.6094e-05 eta: 0:20:32 time: 0.3765 data_time: 0.0019 memory: 20334 grad_norm: 38.1375 loss: 0.6646 2023/06/01 07:38:45 - mmengine - INFO - Epoch(train) [20][2900/5758] lr: 1.6094e-05 eta: 0:19:50 time: 0.4129 data_time: 0.0021 memory: 20334 grad_norm: 38.6878 loss: 0.6653 2023/06/01 07:39:25 - mmengine - INFO - Epoch(train) [20][3000/5758] lr: 1.6094e-05 eta: 0:19:09 time: 0.3859 data_time: 0.0021 memory: 20334 grad_norm: 22.5367 loss: 0.6759 2023/06/01 07:40:06 - mmengine - INFO - Epoch(train) [20][3100/5758] lr: 1.6094e-05 eta: 0:18:27 time: 0.4300 data_time: 0.0021 memory: 20334 grad_norm: 51.0495 loss: 0.6673 2023/06/01 07:40:48 - mmengine - INFO - Epoch(train) [20][3200/5758] lr: 1.6094e-05 eta: 0:17:45 time: 0.4007 data_time: 0.0033 memory: 20334 grad_norm: 46.9795 loss: 0.6686 2023/06/01 07:41:27 - mmengine - INFO - Epoch(train) [20][3300/5758] lr: 1.6094e-05 eta: 0:17:03 time: 0.3918 data_time: 0.0025 memory: 20334 grad_norm: 94.8533 loss: 0.6783 2023/06/01 07:42:09 - mmengine - INFO - Epoch(train) [20][3400/5758] lr: 1.6094e-05 eta: 0:16:22 time: 0.3831 data_time: 0.0033 memory: 20334 grad_norm: 36.0965 loss: 0.6774 2023/06/01 07:42:50 - mmengine - INFO - 
Epoch(train) [20][3500/5758] lr: 1.6094e-05 eta: 0:15:40 time: 0.3858 data_time: 0.0018 memory: 20334 grad_norm: 14.7771 loss: 0.6693 2023/06/01 07:43:31 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:43:32 - mmengine - INFO - Epoch(train) [20][3600/5758] lr: 1.6094e-05 eta: 0:14:58 time: 0.4007 data_time: 0.0025 memory: 20334 grad_norm: 48.9178 loss: 0.6821 2023/06/01 07:44:12 - mmengine - INFO - Epoch(train) [20][3700/5758] lr: 1.6094e-05 eta: 0:14:17 time: 0.4178 data_time: 0.0030 memory: 20334 grad_norm: 41.1830 loss: 0.6624 2023/06/01 07:44:52 - mmengine - INFO - Epoch(train) [20][3800/5758] lr: 1.6094e-05 eta: 0:13:35 time: 0.3991 data_time: 0.0019 memory: 20334 grad_norm: 49.4230 loss: 0.6683 2023/06/01 07:45:33 - mmengine - INFO - Epoch(train) [20][3900/5758] lr: 1.6094e-05 eta: 0:12:53 time: 0.3863 data_time: 0.0018 memory: 20334 grad_norm: 61.3406 loss: 0.6697 2023/06/01 07:46:14 - mmengine - INFO - Epoch(train) [20][4000/5758] lr: 1.6094e-05 eta: 0:12:12 time: 0.4347 data_time: 0.0020 memory: 20334 grad_norm: 20.7913 loss: 0.6625 2023/06/01 07:46:55 - mmengine - INFO - Epoch(train) [20][4100/5758] lr: 1.6094e-05 eta: 0:11:30 time: 0.4153 data_time: 0.0023 memory: 20334 grad_norm: 58.9290 loss: 0.6697 2023/06/01 07:47:36 - mmengine - INFO - Epoch(train) [20][4200/5758] lr: 1.6094e-05 eta: 0:10:48 time: 0.3967 data_time: 0.0021 memory: 20334 grad_norm: 18.9529 loss: 0.6848 2023/06/01 07:48:16 - mmengine - INFO - Epoch(train) [20][4300/5758] lr: 1.6094e-05 eta: 0:10:07 time: 0.3866 data_time: 0.0022 memory: 20334 grad_norm: 20.6123 loss: 0.6800 2023/06/01 07:48:57 - mmengine - INFO - Epoch(train) [20][4400/5758] lr: 1.6094e-05 eta: 0:09:25 time: 0.4148 data_time: 0.0017 memory: 20334 grad_norm: 46.0141 loss: 0.6571 2023/06/01 07:49:39 - mmengine - INFO - Epoch(train) [20][4500/5758] lr: 1.6094e-05 eta: 0:08:43 time: 0.4024 data_time: 0.0020 memory: 20334 grad_norm: 135.2398 loss: 0.6718 2023/06/01 07:50:20 - mmengine - 
INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:50:20 - mmengine - INFO - Epoch(train) [20][4600/5758] lr: 1.6094e-05 eta: 0:08:02 time: 0.4077 data_time: 0.0019 memory: 20334 grad_norm: 42.5326 loss: 0.6675 2023/06/01 07:51:00 - mmengine - INFO - Epoch(train) [20][4700/5758] lr: 1.6094e-05 eta: 0:07:20 time: 0.4156 data_time: 0.0026 memory: 20334 grad_norm: 20.7560 loss: 0.6798 2023/06/01 07:51:41 - mmengine - INFO - Epoch(train) [20][4800/5758] lr: 1.6094e-05 eta: 0:06:38 time: 0.3626 data_time: 0.0022 memory: 20334 grad_norm: 12.9377 loss: 0.6682 2023/06/01 07:52:24 - mmengine - INFO - Epoch(train) [20][4900/5758] lr: 1.6094e-05 eta: 0:05:57 time: 0.4467 data_time: 0.0019 memory: 20334 grad_norm: 40.2078 loss: 0.6652 2023/06/01 07:53:05 - mmengine - INFO - Epoch(train) [20][5000/5758] lr: 1.6094e-05 eta: 0:05:15 time: 0.3934 data_time: 0.0022 memory: 20334 grad_norm: 925.7427 loss: 0.6716 2023/06/01 07:53:45 - mmengine - INFO - Epoch(train) [20][5100/5758] lr: 1.6094e-05 eta: 0:04:34 time: 0.3918 data_time: 0.0017 memory: 20334 grad_norm: 31.5509 loss: 0.6629 2023/06/01 07:54:25 - mmengine - INFO - Epoch(train) [20][5200/5758] lr: 1.6094e-05 eta: 0:03:52 time: 0.4472 data_time: 0.0027 memory: 20334 grad_norm: 53.2940 loss: 0.6778 2023/06/01 07:55:06 - mmengine - INFO - Epoch(train) [20][5300/5758] lr: 1.6094e-05 eta: 0:03:10 time: 0.3977 data_time: 0.0023 memory: 20334 grad_norm: 202.7078 loss: 0.6775 2023/06/01 07:55:47 - mmengine - INFO - Epoch(train) [20][5400/5758] lr: 1.6094e-05 eta: 0:02:29 time: 0.4490 data_time: 0.0017 memory: 20334 grad_norm: 117.0472 loss: 0.6648 2023/06/01 07:56:28 - mmengine - INFO - Epoch(train) [20][5500/5758] lr: 1.6094e-05 eta: 0:01:47 time: 0.4497 data_time: 0.0024 memory: 20334 grad_norm: 166.6944 loss: 0.6743 2023/06/01 07:57:08 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:57:09 - mmengine - INFO - Epoch(train) [20][5600/5758] lr: 1.6094e-05 eta: 0:01:05 time: 
0.4039 data_time: 0.0032 memory: 20334 grad_norm: 60.1827 loss: 0.6722 2023/06/01 07:57:51 - mmengine - INFO - Epoch(train) [20][5700/5758] lr: 1.6094e-05 eta: 0:00:24 time: 0.3753 data_time: 0.0018 memory: 20334 grad_norm: 116.3336 loss: 0.6590 2023/06/01 07:58:14 - mmengine - INFO - Exp name: swin_base_8xb128_fake5m_20230531_183241 2023/06/01 07:58:14 - mmengine - INFO - Saving checkpoint at 20 epochs 2023/06/01 07:58:30 - mmengine - INFO - Epoch(val) [20][8/8] accuracy/top1: 80.5252 single-label/precision_classwise: [100.0, 0.0] single-label/recall_classwise: [80.52522277832031, 0.0] single-label/f1-score_classwise: [89.212158203125, 0.0] data_time: 0.3744 time: 1.0008