Res-VMamba / log_rank0.txt
[2024-02-22 17:55:19 vssm_small] (main.py 401): INFO Full config saved to ./res_vmamba_cnf241_result_best/vssm_small/default/config.json
[2024-02-22 17:55:19 vssm_small] (main.py 404): INFO AMP_ENABLE: true
AMP_OPT_LEVEL: ''
AUG:
AUTO_AUGMENT: rand-m9-mstd0.5-inc1
COLOR_JITTER: 0.4
CUTMIX: 1.0
CUTMIX_MINMAX: null
MIXUP: 0.8
MIXUP_MODE: batch
MIXUP_PROB: 1.0
MIXUP_SWITCH_PROB: 0.5
RECOUNT: 1
REMODE: pixel
REPROB: 0.25
BASE:
- ''
DATA:
BATCH_SIZE: 128
CACHE_MODE: part
DATASET: imagenet
DATA_PATH: /home/public_3T/food_data/CNFOOD-241
IMG_SIZE: 224
INTERPOLATION: bicubic
MASK_PATCH_SIZE: 32
MASK_RATIO: 0.6
NUM_WORKERS: 8
PIN_MEMORY: true
ZIP_MODE: false
ENABLE_AMP: false
EVAL_MODE: false
FUSED_LAYERNORM: false
FUSED_WINDOW_PROCESS: false
LOCAL_RANK: 0
MODEL:
DROP_PATH_RATE: 0.3
DROP_RATE: 0.0
LABEL_SMOOTHING: 0.1
MMCKPT: false
NAME: vssm_small
NUM_CLASSES: 241
PRETRAINED: ./res_vmamba_cnf241_result_2/vssm_small/default/ckpt_epoch_12.pth
RESUME: ''
TYPE: vssm
VSSM:
DEPTHS:
- 2
- 2
- 27
- 2
DOWNSAMPLE: v1
DT_RANK: auto
D_STATE: 16
EMBED_DIM: 96
IN_CHANS: 3
MLP_RATIO: 0.0
PATCH_NORM: true
PATCH_SIZE: 4
SHARED_SSM: false
SOFTMAX: false
SSM_RATIO: 2.0
OUTPUT: ./res_vmamba_cnf241_result_best/vssm_small/default
PRINT_FREQ: 10
SAVE_FREQ: 1
SEED: 0
TAG: default
TEST:
CROP: true
SEQUENTIAL: false
SHUFFLE: false
THROUGHPUT_MODE: false
TRAIN:
ACCUMULATION_STEPS: 1
AUTO_RESUME: true
BASE_LR: 0.000125
CLIP_GRAD: 5.0
EPOCHS: 300
LAYER_DECAY: 1.0
LR_SCHEDULER:
DECAY_EPOCHS: 30
DECAY_RATE: 0.1
GAMMA: 0.1
MULTISTEPS: []
NAME: cosine
WARMUP_PREFIX: true
MIN_LR: 1.25e-06
MOE:
SAVE_MASTER: false
OPTIMIZER:
BETAS:
- 0.9
- 0.999
EPS: 1.0e-08
MOMENTUM: 0.9
NAME: adamw
START_EPOCH: 0
USE_CHECKPOINT: false
WARMUP_EPOCHS: 20
WARMUP_LR: 1.25e-07
WEIGHT_DECAY: 0.05
[2024-02-22 17:55:19 vssm_small] (main.py 405): INFO {"cfg": "configs/vssm/vssm_small_224.yaml", "opts": null, "batch_size": 128, "data_path": "/home/public_3T/food_data/CNFOOD-241", "zip": false, "cache_mode": "part", "pretrained": "./res_vmamba_cnf241_result_2/vssm_small/default/ckpt_epoch_12.pth", "resume": null, "accumulation_steps": null, "use_checkpoint": false, "disable_amp": false, "amp_opt_level": null, "output": "./res_vmamba_cnf241_result_best", "tag": null, "eval": false, "throughput": false, "local_rank": 0, "fused_layernorm": false, "optim": null, "model_ema": true, "model_ema_decay": 0.9999, "model_ema_force_cpu": false}
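The two INFO entries above dump the full run configuration (YAML-style) and the parsed command-line arguments (JSON). As a minimal sketch, assuming the YAML block is copied out to a file of its own, the fields can be re-read with PyYAML for inspection; the file name here is illustrative and not part of the training code:

    # Sketch: re-parse the YAML config dump above to inspect individual fields.
    # Assumes the YAML block was saved as cfg_dump.yaml (illustrative name).
    import yaml

    with open("cfg_dump.yaml") as f:
        cfg = yaml.safe_load(f)

    print(cfg["MODEL"]["NAME"])         # vssm_small
    print(cfg["MODEL"]["NUM_CLASSES"])  # 241
    print(cfg["TRAIN"]["BASE_LR"])      # 0.000125
    print(cfg["DATA"]["DATA_PATH"])     # /home/public_3T/food_data/CNFOOD-241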
[2024-02-22 17:55:20 vssm_small] (main.py 112): INFO Creating model:vssm/vssm_small
[2024-02-22 17:55:20 vssm_small] (main.py 118): INFO VSSM(
(patch_embed): Sequential(
(0): Conv2d(3, 96, kernel_size=(4, 4), stride=(4, 4))
(1): Permute()
(2): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
)
(layers): ModuleList(
(0): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=96, out_features=384, bias=False)
(act): SiLU()
(conv2d): Conv2d(192, 192, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=192)
(out_proj): Linear(in_features=192, out_features=96, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.0)
)
(1): VSSBlock(
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=96, out_features=384, bias=False)
(act): SiLU()
(conv2d): Conv2d(192, 192, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=192)
(out_proj): Linear(in_features=192, out_features=96, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.00937500037252903)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=384, out_features=192, bias=False)
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
)
)
(1): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=192, out_features=768, bias=False)
(act): SiLU()
(conv2d): Conv2d(384, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=384)
(out_proj): Linear(in_features=384, out_features=192, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.01875000074505806)
)
(1): VSSBlock(
(norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=192, out_features=768, bias=False)
(act): SiLU()
(conv2d): Conv2d(384, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=384)
(out_proj): Linear(in_features=384, out_features=192, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.02812500111758709)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=768, out_features=384, bias=False)
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
)
)
(2): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.03750000149011612)
)
(1): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.046875)
)
(2): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.05625000223517418)
)
(3): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.06562500447034836)
)
(4): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.07500000298023224)
)
(5): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.08437500149011612)
)
(6): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.09375)
)
(7): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.10312500596046448)
)
(8): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.11250000447034836)
)
(9): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.12187500298023224)
)
(10): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.13125000894069672)
)
(11): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.140625)
)
(12): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.15000000596046448)
)
(13): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.15937501192092896)
)
(14): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.16875000298023224)
)
(15): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.17812500894069672)
)
(16): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.1875)
)
(17): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.19687500596046448)
)
(18): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.20625001192092896)
)
(19): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.21562501788139343)
)
(20): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.22500000894069672)
)
(21): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2343750149011612)
)
(22): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.24375000596046448)
)
(23): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.25312501192092896)
)
(24): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.26250001788139343)
)
(25): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2718750238418579)
)
(26): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.28125)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=1536, out_features=768, bias=False)
(norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
)
)
(3): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=768, out_features=3072, bias=False)
(act): SiLU()
(conv2d): Conv2d(1536, 1536, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=1536)
(out_proj): Linear(in_features=1536, out_features=768, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2906250059604645)
)
(1): VSSBlock(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=768, out_features=3072, bias=False)
(act): SiLU()
(conv2d): Conv2d(1536, 1536, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=1536)
(out_proj): Linear(in_features=1536, out_features=768, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.30000001192092896)
)
)
(downsample): Identity()
)
)
(classifier): Sequential(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(permute): Permute()
(avgpool): AdaptiveAvgPool2d(output_size=1)
(flatten): Flatten(start_dim=1, end_dim=-1)
(head): Linear(in_features=768, out_features=1000, bias=True)
)
)
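The timm.DropPath rates printed for each VSSBlock above follow a linear stochastic-depth schedule from 0 to MODEL.DROP_PATH_RATE = 0.3 over the sum of VSSM.DEPTHS = 2 + 2 + 27 + 2 = 33 blocks. A minimal sketch that reproduces the printed values (0.0, 0.009375, ..., 0.290625, 0.3):

    # Sketch: reconstruct the per-block DropPath schedule shown in the printout
    # from DEPTHS = [2, 2, 27, 2] and DROP_PATH_RATE = 0.3.
    import torch

    depths = [2, 2, 27, 2]
    drop_path_rate = 0.3

    # one rate per block, increasing linearly from 0 to drop_path_rate
    rates = torch.linspace(0, drop_path_rate, sum(depths)).tolist()

    # split the flat list back into per-stage chunks
    per_stage, i = [], 0
    for d in depths:
        per_stage.append(rates[i:i + d])
        i += d

    print(per_stage[0])   # [0.0, 0.009375]
    print(per_stage[3])   # [0.290625, 0.3]  (last stage, matching the printout)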
[2024-02-22 17:55:20 vssm_small] (main.py 120): INFO number of params: 44417416
[2024-02-22 17:55:22 vssm_small] (main.py 123): INFO number of GFLOPs: 11.231522784
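The reported parameter count (44,417,416) is the usual sum of element counts over trainable tensors; the GFLOPs figure comes from a separate profiler and is not reproduced here. A minimal sketch of the parameter count:

    # Sketch: count trainable parameters, matching the "number of params" line.
    def count_parameters(model):
        return sum(p.numel() for p in model.parameters() if p.requires_grad)

    # Usage (the VSSM model construction itself is omitted):
    # print("number of params:", count_parameters(model))   # 44417416 here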
[2024-02-22 17:55:22 vssm_small] (main.py 167): INFO auto resuming from ./res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth
[2024-02-22 17:55:22 vssm_small] (utils.py 18): INFO ==============> Resuming from ./res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth....................
[2024-02-22 17:55:23 vssm_small] (utils.py 27): INFO resuming model: <All keys matched successfully>
[2024-02-22 17:55:23 vssm_small] (utils.py 34): INFO resuming model_ema: <All keys matched successfully>
[2024-02-22 17:55:24 vssm_small] (utils.py 48): INFO => loaded successfully './res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth' (epoch 166)
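The '<All keys matched successfully>' messages are simply the printed return value of load_state_dict when every checkpoint key matches the model. A hedged sketch of the resume step; the checkpoint key names ("model", "model_ema", "epoch") are assumptions based on the log, not the exact utils.py code:

    # Sketch: resume model and EMA weights from a checkpoint, as the utils.py
    # lines above suggest. Checkpoint key names are assumptions for illustration.
    import torch

    def resume_checkpoint(model, model_ema, ckpt_path):
        checkpoint = torch.load(ckpt_path, map_location="cpu")
        msg = model.load_state_dict(checkpoint["model"], strict=False)
        print(f"resuming model: {msg}")            # <All keys matched successfully>
        if model_ema is not None and "model_ema" in checkpoint:
            msg = model_ema.load_state_dict(checkpoint["model_ema"], strict=False)
            print(f"resuming model_ema: {msg}")    # <All keys matched successfully>
        return checkpoint.get("epoch", 0)          # e.g. 166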
[2024-02-22 17:55:34 vssm_small] (main.py 324): INFO Test: [0/402] Time 10.582 (10.582) Loss 0.4614 (0.4614) Acc@1 89.844 (89.844) Acc@5 97.656 (97.656) Mem 7155MB
[2024-02-22 17:55:39 vssm_small] (main.py 324): INFO Test: [10/402] Time 0.483 (1.401) Loss 1.1680 (0.9504) Acc@1 76.562 (80.114) Acc@5 92.188 (94.105) Mem 7155MB
[2024-02-22 17:55:44 vssm_small] (main.py 324): INFO Test: [20/402] Time 0.482 (0.964) Loss 1.6357 (0.9252) Acc@1 52.344 (78.720) Acc@5 92.188 (95.164) Mem 7155MB
[2024-02-22 17:55:49 vssm_small] (main.py 324): INFO Test: [30/402] Time 0.483 (0.809) Loss 0.9429 (0.9577) Acc@1 76.562 (77.848) Acc@5 98.438 (95.237) Mem 7155MB
[2024-02-22 17:55:53 vssm_small] (main.py 324): INFO Test: [40/402] Time 0.483 (0.729) Loss 1.0166 (0.9836) Acc@1 72.656 (76.791) Acc@5 96.875 (95.560) Mem 7155MB
[2024-02-22 17:55:58 vssm_small] (main.py 324): INFO Test: [50/402] Time 0.483 (0.681) Loss 0.6353 (0.9501) Acc@1 83.594 (77.788) Acc@5 97.656 (95.527) Mem 7155MB
[2024-02-22 17:56:03 vssm_small] (main.py 324): INFO Test: [60/402] Time 0.482 (0.648) Loss 0.7671 (0.9817) Acc@1 80.469 (77.011) Acc@5 96.875 (95.197) Mem 7155MB
[2024-02-22 17:56:08 vssm_small] (main.py 324): INFO Test: [70/402] Time 0.482 (0.625) Loss 0.8667 (0.9259) Acc@1 77.344 (78.444) Acc@5 96.094 (95.478) Mem 7155MB
[2024-02-22 17:56:13 vssm_small] (main.py 324): INFO Test: [80/402] Time 0.483 (0.607) Loss 0.4990 (0.9246) Acc@1 88.281 (78.511) Acc@5 99.219 (95.515) Mem 7155MB
[2024-02-22 17:56:18 vssm_small] (main.py 324): INFO Test: [90/402] Time 0.483 (0.594) Loss 0.3621 (0.8928) Acc@1 92.188 (79.327) Acc@5 100.000 (95.639) Mem 7155MB
[2024-02-22 17:56:22 vssm_small] (main.py 324): INFO Test: [100/402] Time 0.482 (0.583) Loss 1.1924 (0.9116) Acc@1 78.125 (79.038) Acc@5 90.625 (95.359) Mem 7155MB
[2024-02-22 17:56:27 vssm_small] (main.py 324): INFO Test: [110/402] Time 0.483 (0.574) Loss 0.4204 (0.9173) Acc@1 85.938 (78.899) Acc@5 99.219 (95.341) Mem 7155MB
[2024-02-22 17:56:32 vssm_small] (main.py 324): INFO Test: [120/402] Time 0.483 (0.566) Loss 2.3027 (0.9299) Acc@1 46.094 (78.622) Acc@5 83.594 (95.274) Mem 7155MB
[2024-02-22 17:56:37 vssm_small] (main.py 324): INFO Test: [130/402] Time 0.482 (0.560) Loss 0.4097 (0.9233) Acc@1 91.406 (78.715) Acc@5 99.219 (95.366) Mem 7155MB
[2024-02-22 17:56:42 vssm_small] (main.py 324): INFO Test: [140/402] Time 0.483 (0.554) Loss 2.3652 (0.9762) Acc@1 47.656 (77.543) Acc@5 75.781 (94.891) Mem 7155MB
[2024-02-22 17:56:47 vssm_small] (main.py 324): INFO Test: [150/402] Time 0.482 (0.549) Loss 0.9092 (0.9862) Acc@1 78.906 (77.266) Acc@5 93.750 (94.868) Mem 7155MB
[2024-02-22 17:56:51 vssm_small] (main.py 324): INFO Test: [160/402] Time 0.482 (0.545) Loss 1.4434 (0.9815) Acc@1 61.719 (77.300) Acc@5 93.750 (94.978) Mem 7155MB
[2024-02-22 17:56:56 vssm_small] (main.py 324): INFO Test: [170/402] Time 0.483 (0.542) Loss 0.6587 (0.9758) Acc@1 89.062 (77.421) Acc@5 94.531 (95.006) Mem 7155MB
[2024-02-22 17:57:01 vssm_small] (main.py 324): INFO Test: [180/402] Time 0.482 (0.538) Loss 1.8477 (0.9902) Acc@1 53.125 (77.175) Acc@5 93.750 (94.864) Mem 7155MB
[2024-02-22 17:57:06 vssm_small] (main.py 324): INFO Test: [190/402] Time 0.483 (0.535) Loss 0.4958 (0.9845) Acc@1 89.844 (77.356) Acc@5 96.875 (94.818) Mem 7155MB
[2024-02-22 17:57:11 vssm_small] (main.py 324): INFO Test: [200/402] Time 0.482 (0.533) Loss 0.3074 (0.9707) Acc@1 95.312 (77.697) Acc@5 97.656 (94.916) Mem 7155MB
[2024-02-22 17:57:16 vssm_small] (main.py 324): INFO Test: [210/402] Time 0.483 (0.530) Loss 0.5928 (0.9563) Acc@1 83.594 (78.058) Acc@5 99.219 (95.016) Mem 7155MB
[2024-02-22 17:57:20 vssm_small] (main.py 324): INFO Test: [220/402] Time 0.482 (0.528) Loss 1.1055 (0.9381) Acc@1 76.562 (78.450) Acc@5 92.969 (95.178) Mem 7155MB
[2024-02-22 17:57:25 vssm_small] (main.py 324): INFO Test: [230/402] Time 0.482 (0.526) Loss 2.1230 (0.9502) Acc@1 60.156 (78.436) Acc@5 78.906 (94.913) Mem 7155MB
[2024-02-22 17:57:30 vssm_small] (main.py 324): INFO Test: [240/402] Time 0.483 (0.524) Loss 1.1201 (0.9431) Acc@1 67.188 (78.618) Acc@5 98.438 (94.972) Mem 7155MB
[2024-02-22 17:57:35 vssm_small] (main.py 324): INFO Test: [250/402] Time 0.483 (0.523) Loss 1.8711 (0.9650) Acc@1 53.906 (78.019) Acc@5 95.312 (94.933) Mem 7155MB
[2024-02-22 17:57:40 vssm_small] (main.py 324): INFO Test: [260/402] Time 0.482 (0.521) Loss 0.9282 (0.9637) Acc@1 76.562 (78.023) Acc@5 99.219 (94.983) Mem 7155MB
[2024-02-22 17:57:44 vssm_small] (main.py 324): INFO Test: [270/402] Time 0.483 (0.520) Loss 1.1191 (0.9527) Acc@1 68.750 (78.269) Acc@5 94.531 (95.056) Mem 7155MB
[2024-02-22 17:57:49 vssm_small] (main.py 324): INFO Test: [280/402] Time 0.483 (0.519) Loss 2.3047 (0.9652) Acc@1 19.531 (77.755) Acc@5 92.188 (95.032) Mem 7155MB
[2024-02-22 17:57:54 vssm_small] (main.py 324): INFO Test: [290/402] Time 0.482 (0.517) Loss 0.8774 (0.9767) Acc@1 80.469 (77.489) Acc@5 93.750 (94.912) Mem 7155MB
[2024-02-22 17:57:59 vssm_small] (main.py 324): INFO Test: [300/402] Time 0.483 (0.516) Loss 1.0645 (0.9802) Acc@1 80.469 (77.512) Acc@5 92.188 (94.817) Mem 7155MB
[2024-02-22 17:58:04 vssm_small] (main.py 324): INFO Test: [310/402] Time 0.483 (0.515) Loss 1.0410 (0.9764) Acc@1 79.688 (77.462) Acc@5 96.094 (94.903) Mem 7155MB
[2024-02-22 17:58:09 vssm_small] (main.py 324): INFO Test: [320/402] Time 0.483 (0.514) Loss 1.0859 (0.9653) Acc@1 76.562 (77.743) Acc@5 89.844 (94.960) Mem 7155MB
[2024-02-22 17:58:13 vssm_small] (main.py 324): INFO Test: [330/402] Time 0.483 (0.513) Loss 1.0596 (0.9657) Acc@1 73.438 (77.714) Acc@5 95.312 (95.001) Mem 7155MB
[2024-02-22 17:58:18 vssm_small] (main.py 324): INFO Test: [340/402] Time 0.482 (0.512) Loss 0.3967 (0.9663) Acc@1 90.625 (77.699) Acc@5 100.000 (95.028) Mem 7155MB
[2024-02-22 17:58:23 vssm_small] (main.py 324): INFO Test: [350/402] Time 0.483 (0.511) Loss 1.2148 (0.9637) Acc@1 68.750 (77.773) Acc@5 96.875 (95.050) Mem 7155MB
[2024-02-22 17:58:28 vssm_small] (main.py 324): INFO Test: [360/402] Time 0.483 (0.511) Loss 0.9941 (0.9685) Acc@1 79.688 (77.571) Acc@5 95.312 (95.074) Mem 7155MB
[2024-02-22 17:58:33 vssm_small] (main.py 324): INFO Test: [370/402] Time 0.482 (0.510) Loss 0.9004 (0.9689) Acc@1 83.594 (77.552) Acc@5 94.531 (95.081) Mem 7155MB
[2024-02-22 17:58:38 vssm_small] (main.py 324): INFO Test: [380/402] Time 0.482 (0.509) Loss 0.7358 (0.9634) Acc@1 82.812 (77.690) Acc@5 97.656 (95.114) Mem 7155MB
[2024-02-22 17:58:42 vssm_small] (main.py 324): INFO Test: [390/402] Time 0.482 (0.508) Loss 1.0068 (0.9605) Acc@1 77.344 (77.807) Acc@5 94.531 (95.113) Mem 7155MB
[2024-02-22 17:58:47 vssm_small] (main.py 324): INFO Test: [400/402] Time 0.482 (0.508) Loss 0.2834 (0.9484) Acc@1 95.312 (78.141) Acc@5 99.219 (95.184) Mem 7155MB
[2024-02-22 17:58:48 vssm_small] (main.py 331): INFO * Acc@1 78.150 Acc@5 95.186
[2024-02-22 17:58:48 vssm_small] (main.py 174): INFO Accuracy of the network on the 51354 test images: 78.1%
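Each Test line reports the value for the current batch followed by the running average in parentheses; Acc@1 and Acc@5 are standard top-1/top-5 accuracies. A minimal sketch of the usual top-k computation (not necessarily the project's exact code):

    # Sketch: top-k accuracy as typically computed in validation loops like the one above.
    import torch

    def accuracy(output, target, topk=(1, 5)):
        maxk = max(topk)
        _, pred = output.topk(maxk, dim=1, largest=True, sorted=True)  # (B, maxk)
        correct = pred.eq(target.view(-1, 1))                          # (B, maxk) bool
        return [correct[:, :k].any(dim=1).float().mean().item() * 100.0 for k in topk]

    # Example with random logits over the 241 CNFOOD-241 classes:
    acc1, acc5 = accuracy(torch.randn(128, 241), torch.randint(0, 241, (128,)))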
[2024-02-22 17:58:57 vssm_small] (main.py 324): INFO Test: [0/402] Time 8.919 (8.919) Loss 0.4504 (0.4504) Acc@1 91.406 (91.406) Acc@5 98.438 (98.438) Mem 7155MB
[2024-02-22 17:59:02 vssm_small] (main.py 324): INFO Test: [10/402] Time 0.483 (1.250) Loss 0.9731 (0.8429) Acc@1 83.594 (82.599) Acc@5 91.406 (94.957) Mem 7155MB
[2024-02-22 17:59:07 vssm_small] (main.py 324): INFO Test: [20/402] Time 0.483 (0.884) Loss 1.3154 (0.8200) Acc@1 60.938 (81.510) Acc@5 94.531 (95.573) Mem 7155MB
[2024-02-22 17:59:12 vssm_small] (main.py 324): INFO Test: [30/402] Time 0.483 (0.755) Loss 0.8833 (0.8867) Acc@1 76.562 (79.410) Acc@5 96.875 (95.514) Mem 7155MB
[2024-02-22 17:59:16 vssm_small] (main.py 324): INFO Test: [40/402] Time 0.482 (0.688) Loss 0.8809 (0.8790) Acc@1 74.219 (79.002) Acc@5 98.438 (95.922) Mem 7155MB
[2024-02-22 17:59:21 vssm_small] (main.py 324): INFO Test: [50/402] Time 0.483 (0.648) Loss 0.5254 (0.8631) Acc@1 89.062 (79.611) Acc@5 97.656 (95.956) Mem 7155MB
[2024-02-22 17:59:26 vssm_small] (main.py 324): INFO Test: [60/402] Time 0.483 (0.621) Loss 0.6147 (0.8904) Acc@1 85.156 (78.855) Acc@5 97.656 (95.671) Mem 7155MB
[2024-02-22 17:59:31 vssm_small] (main.py 324): INFO Test: [70/402] Time 0.483 (0.601) Loss 1.0029 (0.8448) Acc@1 77.344 (80.095) Acc@5 96.875 (96.006) Mem 7155MB
[2024-02-22 17:59:36 vssm_small] (main.py 324): INFO Test: [80/402] Time 0.483 (0.587) Loss 0.5259 (0.8449) Acc@1 85.156 (80.102) Acc@5 98.438 (96.007) Mem 7155MB
[2024-02-22 17:59:41 vssm_small] (main.py 324): INFO Test: [90/402] Time 0.483 (0.575) Loss 0.2947 (0.8155) Acc@1 94.531 (80.872) Acc@5 100.000 (96.162) Mem 7155MB
[2024-02-22 17:59:45 vssm_small] (main.py 324): INFO Test: [100/402] Time 0.483 (0.566) Loss 1.2002 (0.8335) Acc@1 76.562 (80.554) Acc@5 92.188 (95.978) Mem 7155MB
[2024-02-22 17:59:50 vssm_small] (main.py 324): INFO Test: [110/402] Time 0.483 (0.559) Loss 0.4329 (0.8417) Acc@1 86.719 (80.342) Acc@5 100.000 (95.967) Mem 7155MB
[2024-02-22 17:59:55 vssm_small] (main.py 324): INFO Test: [120/402] Time 0.483 (0.552) Loss 2.2422 (0.8554) Acc@1 47.656 (80.139) Acc@5 84.375 (95.874) Mem 7155MB
[2024-02-22 18:00:00 vssm_small] (main.py 324): INFO Test: [130/402] Time 0.483 (0.547) Loss 0.4048 (0.8500) Acc@1 90.625 (80.200) Acc@5 99.219 (95.974) Mem 7155MB
[2024-02-22 18:00:05 vssm_small] (main.py 324): INFO Test: [140/402] Time 0.483 (0.543) Loss 2.1191 (0.9016) Acc@1 52.344 (79.039) Acc@5 83.594 (95.495) Mem 7155MB
[2024-02-22 18:00:09 vssm_small] (main.py 324): INFO Test: [150/402] Time 0.483 (0.539) Loss 0.8765 (0.9130) Acc@1 78.906 (78.715) Acc@5 94.531 (95.442) Mem 7155MB
[2024-02-22 18:00:14 vssm_small] (main.py 324): INFO Test: [160/402] Time 0.483 (0.535) Loss 1.3135 (0.9056) Acc@1 67.969 (78.872) Acc@5 95.312 (95.541) Mem 7155MB
[2024-02-22 18:00:19 vssm_small] (main.py 324): INFO Test: [170/402] Time 0.483 (0.532) Loss 0.5923 (0.8953) Acc@1 90.625 (79.094) Acc@5 95.312 (95.591) Mem 7155MB
[2024-02-22 18:00:24 vssm_small] (main.py 324): INFO Test: [180/402] Time 0.483 (0.529) Loss 1.8027 (0.9146) Acc@1 53.125 (78.699) Acc@5 94.531 (95.395) Mem 7155MB
[2024-02-22 18:00:29 vssm_small] (main.py 324): INFO Test: [190/402] Time 0.483 (0.527) Loss 0.4436 (0.9099) Acc@1 91.406 (78.865) Acc@5 97.656 (95.357) Mem 7155MB
[2024-02-22 18:00:34 vssm_small] (main.py 324): INFO Test: [200/402] Time 0.483 (0.525) Loss 0.2937 (0.8963) Acc@1 96.875 (79.190) Acc@5 98.438 (95.464) Mem 7155MB
[2024-02-22 18:00:38 vssm_small] (main.py 324): INFO Test: [210/402] Time 0.483 (0.523) Loss 0.5981 (0.8853) Acc@1 84.375 (79.465) Acc@5 98.438 (95.542) Mem 7155MB
[2024-02-22 18:00:43 vssm_small] (main.py 324): INFO Test: [220/402] Time 0.483 (0.521) Loss 1.0889 (0.8694) Acc@1 77.344 (79.811) Acc@5 92.969 (95.680) Mem 7155MB
[2024-02-22 18:00:48 vssm_small] (main.py 324): INFO Test: [230/402] Time 0.483 (0.519) Loss 1.9727 (0.8842) Acc@1 60.156 (79.708) Acc@5 78.906 (95.394) Mem 7155MB
[2024-02-22 18:00:53 vssm_small] (main.py 324): INFO Test: [240/402] Time 0.482 (0.518) Loss 1.2422 (0.8778) Acc@1 60.156 (79.853) Acc@5 98.438 (95.445) Mem 7155MB
[2024-02-22 18:00:58 vssm_small] (main.py 324): INFO Test: [250/402] Time 0.483 (0.516) Loss 1.4551 (0.8951) Acc@1 60.938 (79.358) Acc@5 95.312 (95.412) Mem 7155MB
[2024-02-22 18:01:03 vssm_small] (main.py 324): INFO Test: [260/402] Time 0.483 (0.515) Loss 0.8667 (0.8933) Acc@1 78.906 (79.331) Acc@5 98.438 (95.489) Mem 7155MB
[2024-02-22 18:01:07 vssm_small] (main.py 324): INFO Test: [270/402] Time 0.482 (0.514) Loss 0.9072 (0.8828) Acc@1 77.344 (79.578) Acc@5 96.094 (95.543) Mem 7155MB
[2024-02-22 18:01:12 vssm_small] (main.py 324): INFO Test: [280/402] Time 0.483 (0.513) Loss 2.3594 (0.8950) Acc@1 19.531 (79.101) Acc@5 92.188 (95.527) Mem 7155MB
[2024-02-22 18:01:17 vssm_small] (main.py 324): INFO Test: [290/402] Time 0.483 (0.512) Loss 0.8384 (0.9058) Acc@1 82.031 (78.845) Acc@5 95.312 (95.455) Mem 7155MB
[2024-02-22 18:01:22 vssm_small] (main.py 324): INFO Test: [300/402] Time 0.482 (0.511) Loss 0.9658 (0.9067) Acc@1 81.250 (78.914) Acc@5 93.750 (95.396) Mem 7155MB
[2024-02-22 18:01:27 vssm_small] (main.py 324): INFO Test: [310/402] Time 0.483 (0.510) Loss 1.0488 (0.9032) Acc@1 81.250 (78.861) Acc@5 96.094 (95.481) Mem 7155MB
[2024-02-22 18:01:32 vssm_small] (main.py 324): INFO Test: [320/402] Time 0.483 (0.509) Loss 0.8892 (0.8902) Acc@1 82.031 (79.193) Acc@5 92.188 (95.536) Mem 7155MB
[2024-02-22 18:01:36 vssm_small] (main.py 324): INFO Test: [330/402] Time 0.483 (0.508) Loss 0.8677 (0.8919) Acc@1 79.688 (79.145) Acc@5 97.656 (95.567) Mem 7155MB
[2024-02-22 18:01:41 vssm_small] (main.py 324): INFO Test: [340/402] Time 0.483 (0.507) Loss 0.3433 (0.8911) Acc@1 89.844 (79.138) Acc@5 100.000 (95.597) Mem 7155MB
[2024-02-22 18:01:46 vssm_small] (main.py 324): INFO Test: [350/402] Time 0.483 (0.507) Loss 0.8315 (0.8892) Acc@1 78.906 (79.200) Acc@5 98.438 (95.620) Mem 7155MB
[2024-02-22 18:01:51 vssm_small] (main.py 324): INFO Test: [360/402] Time 0.483 (0.506) Loss 0.9419 (0.8932) Acc@1 78.125 (79.006) Acc@5 96.094 (95.654) Mem 7155MB
[2024-02-22 18:01:56 vssm_small] (main.py 324): INFO Test: [370/402] Time 0.483 (0.505) Loss 0.8735 (0.8931) Acc@1 82.031 (78.997) Acc@5 94.531 (95.652) Mem 7155MB
[2024-02-22 18:02:01 vssm_small] (main.py 324): INFO Test: [380/402] Time 0.483 (0.505) Loss 0.7627 (0.8883) Acc@1 82.031 (79.138) Acc@5 97.656 (95.682) Mem 7155MB
[2024-02-22 18:02:05 vssm_small] (main.py 324): INFO Test: [390/402] Time 0.483 (0.504) Loss 0.9995 (0.8870) Acc@1 78.125 (79.218) Acc@5 93.750 (95.666) Mem 7155MB
[2024-02-22 18:02:10 vssm_small] (main.py 324): INFO Test: [400/402] Time 0.482 (0.504) Loss 0.2445 (0.8753) Acc@1 96.094 (79.534) Acc@5 99.219 (95.722) Mem 7155MB
[2024-02-22 18:02:11 vssm_small] (main.py 331): INFO * Acc@1 79.544 Acc@5 95.724
[2024-02-22 18:02:11 vssm_small] (main.py 177): INFO Accuracy of the network ema on the 51354 test images: 79.5%
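This second evaluation pass uses the EMA copy of the weights (the args dump shows model_ema: true and model_ema_decay: 0.9999). A minimal sketch of the usual exponential-moving-average update applied after each optimizer step, not necessarily the exact implementation used here:

    # Sketch: EMA weight update with decay 0.9999 (ema <- decay*ema + (1-decay)*param).
    import torch

    @torch.no_grad()
    def update_ema(ema_model, model, decay=0.9999):
        ema_params = dict(ema_model.named_parameters())
        for name, param in model.named_parameters():
            ema_params[name].mul_(decay).add_(param.detach(), alpha=1.0 - decay)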
[2024-02-22 18:02:11 vssm_small] (main.py 196): INFO Start training
[2024-02-22 18:02:22 vssm_small] (main.py 274): INFO Train: [167/300][0/933] eta 2:59:28 lr 0.000058 wd 0.0500 time 11.5413 (11.5413) data time 8.5294 (8.5294) loss 3.2837 (3.2837) grad_norm 7.2747 (7.2747) loss_scale 32768.0000 (32768.0000) mem 50097MB
[2024-02-22 18:02:38 vssm_small] (main.py 274): INFO Train: [167/300][10/933] eta 0:38:04 lr 0.000058 wd 0.0500 time 1.5679 (2.4753) data time 0.0005 (0.7759) loss 2.0028 (3.0696) grad_norm 5.6496 (6.8319) loss_scale 32768.0000 (32768.0000) mem 50285MB
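The logged lr 0.000058 at epoch 167 is consistent with the LR_SCHEDULER block above: cosine decay with a 20-epoch warmup prefix between BASE_LR = 1.25e-4 and MIN_LR = 1.25e-6. A minimal per-epoch sketch, assuming a standard warmup-then-cosine rule rather than the project's per-iteration scheduler:

    # Sketch: warmup + cosine LR rule implied by the TRAIN config above.
    import math

    def lr_at_epoch(epoch, base_lr=1.25e-4, min_lr=1.25e-6, warmup_lr=1.25e-7,
                    warmup_epochs=20, total_epochs=300):
        if epoch < warmup_epochs:
            # linear warmup from WARMUP_LR to BASE_LR
            return warmup_lr + (base_lr - warmup_lr) * epoch / warmup_epochs
        # cosine decay over the remaining epochs (WARMUP_PREFIX: true)
        t = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
        return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * t))

    print(f"{lr_at_epoch(167):.6f}")   # 0.000058, matching the Train lines above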
[2024-02-22 18:03:56 vssm_small] (main.py 401): INFO Full config saved to ./res_vmamba_cnf241_result_best/vssm_small/default/config.json
[2024-02-22 18:03:56 vssm_small] (main.py 404): INFO AMP_ENABLE: true
AMP_OPT_LEVEL: ''
AUG:
AUTO_AUGMENT: rand-m9-mstd0.5-inc1
COLOR_JITTER: 0.4
CUTMIX: 1.0
CUTMIX_MINMAX: null
MIXUP: 0.8
MIXUP_MODE: batch
MIXUP_PROB: 1.0
MIXUP_SWITCH_PROB: 0.5
RECOUNT: 1
REMODE: pixel
REPROB: 0.25
BASE:
- ''
DATA:
BATCH_SIZE: 128
CACHE_MODE: part
DATASET: imagenet
DATA_PATH: /home/public_3T/food_data/CNFOOD-241
IMG_SIZE: 224
INTERPOLATION: bicubic
MASK_PATCH_SIZE: 32
MASK_RATIO: 0.6
NUM_WORKERS: 8
PIN_MEMORY: true
ZIP_MODE: false
ENABLE_AMP: false
EVAL_MODE: false
FUSED_LAYERNORM: false
FUSED_WINDOW_PROCESS: false
LOCAL_RANK: 0
MODEL:
DROP_PATH_RATE: 0.3
DROP_RATE: 0.0
LABEL_SMOOTHING: 0.1
MMCKPT: false
NAME: vssm_small
NUM_CLASSES: 241
PRETRAINED: ./res_vmamba_cnf241_result_2/vssm_small/default/ckpt_epoch_12.pth
RESUME: ''
TYPE: vssm
VSSM:
DEPTHS:
- 2
- 2
- 27
- 2
DOWNSAMPLE: v1
DT_RANK: auto
D_STATE: 16
EMBED_DIM: 96
IN_CHANS: 3
MLP_RATIO: 0.0
PATCH_NORM: true
PATCH_SIZE: 4
SHARED_SSM: false
SOFTMAX: false
SSM_RATIO: 2.0
OUTPUT: ./res_vmamba_cnf241_result_best/vssm_small/default
PRINT_FREQ: 10
SAVE_FREQ: 1
SEED: 0
TAG: default
TEST:
CROP: true
SEQUENTIAL: false
SHUFFLE: false
THROUGHPUT_MODE: false
TRAIN:
ACCUMULATION_STEPS: 1
AUTO_RESUME: true
BASE_LR: 0.000125
CLIP_GRAD: 5.0
EPOCHS: 300
LAYER_DECAY: 1.0
LR_SCHEDULER:
DECAY_EPOCHS: 30
DECAY_RATE: 0.1
GAMMA: 0.1
MULTISTEPS: []
NAME: cosine
WARMUP_PREFIX: true
MIN_LR: 1.25e-06
MOE:
SAVE_MASTER: false
OPTIMIZER:
BETAS:
- 0.9
- 0.999
EPS: 1.0e-08
MOMENTUM: 0.9
NAME: adamw
START_EPOCH: 0
USE_CHECKPOINT: false
WARMUP_EPOCHS: 20
WARMUP_LR: 1.25e-07
WEIGHT_DECAY: 0.05
[2024-02-22 18:03:56 vssm_small] (main.py 405): INFO {"cfg": "configs/vssm/vssm_small_224.yaml", "opts": null, "batch_size": 128, "data_path": "/home/public_3T/food_data/CNFOOD-241", "zip": false, "cache_mode": "part", "pretrained": "./res_vmamba_cnf241_result_2/vssm_small/default/ckpt_epoch_12.pth", "resume": null, "accumulation_steps": null, "use_checkpoint": false, "disable_amp": false, "amp_opt_level": null, "output": "./res_vmamba_cnf241_result_best", "tag": null, "eval": false, "throughput": false, "local_rank": 0, "fused_layernorm": false, "optim": null, "model_ema": true, "model_ema_decay": 0.9999, "model_ema_force_cpu": false}
[2024-02-22 18:03:56 vssm_small] (main.py 112): INFO Creating model:vssm/vssm_small
[2024-02-22 18:03:57 vssm_small] (main.py 118): INFO VSSM(
(patch_embed): Sequential(
(0): Conv2d(3, 96, kernel_size=(4, 4), stride=(4, 4))
(1): Permute()
(2): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
)
(layers): ModuleList(
(0): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=96, out_features=384, bias=False)
(act): SiLU()
(conv2d): Conv2d(192, 192, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=192)
(out_proj): Linear(in_features=192, out_features=96, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.0)
)
(1): VSSBlock(
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=96, out_features=384, bias=False)
(act): SiLU()
(conv2d): Conv2d(192, 192, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=192)
(out_proj): Linear(in_features=192, out_features=96, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.00937500037252903)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=384, out_features=192, bias=False)
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
)
)
(1): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=192, out_features=768, bias=False)
(act): SiLU()
(conv2d): Conv2d(384, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=384)
(out_proj): Linear(in_features=384, out_features=192, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.01875000074505806)
)
(1): VSSBlock(
(norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=192, out_features=768, bias=False)
(act): SiLU()
(conv2d): Conv2d(384, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=384)
(out_proj): Linear(in_features=384, out_features=192, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.02812500111758709)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=768, out_features=384, bias=False)
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
)
)
(2): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.03750000149011612)
)
(1): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.046875)
)
(2): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.05625000223517418)
)
(3): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.06562500447034836)
)
(4): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.07500000298023224)
)
(5): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.08437500149011612)
)
(6): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.09375)
)
(7): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.10312500596046448)
)
(8): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.11250000447034836)
)
(9): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.12187500298023224)
)
(10): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.13125000894069672)
)
(11): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.140625)
)
(12): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.15000000596046448)
)
(13): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.15937501192092896)
)
(14): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.16875000298023224)
)
(15): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.17812500894069672)
)
(16): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.1875)
)
(17): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.19687500596046448)
)
(18): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.20625001192092896)
)
(19): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.21562501788139343)
)
(20): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.22500000894069672)
)
(21): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2343750149011612)
)
(22): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.24375000596046448)
)
(23): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.25312501192092896)
)
(24): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.26250001788139343)
)
(25): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2718750238418579)
)
(26): VSSBlock(
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=384, out_features=1536, bias=False)
(act): SiLU()
(conv2d): Conv2d(768, 768, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=768)
(out_proj): Linear(in_features=768, out_features=384, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.28125)
)
)
(downsample): PatchMerging2D(
(reduction): Linear(in_features=1536, out_features=768, bias=False)
(norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
)
)
(3): Sequential(
(blocks): Sequential(
(0): VSSBlock(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=768, out_features=3072, bias=False)
(act): SiLU()
(conv2d): Conv2d(1536, 1536, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=1536)
(out_proj): Linear(in_features=1536, out_features=768, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.2906250059604645)
)
(1): VSSBlock(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(op): SS2D(
(out_norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
(in_proj): Linear(in_features=768, out_features=3072, bias=False)
(act): SiLU()
(conv2d): Conv2d(1536, 1536, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=1536)
(out_proj): Linear(in_features=1536, out_features=768, bias=False)
(dropout): Identity()
)
(drop_path): timm.DropPath(0.30000001192092896)
)
)
(downsample): Identity()
)
)
(classifier): Sequential(
(norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(permute): Permute()
(avgpool): AdaptiveAvgPool2d(output_size=1)
(flatten): Flatten(start_dim=1, end_dim=-1)
(head): Linear(in_features=768, out_features=1000, bias=True)
)
)
[2024-02-22 18:03:57 vssm_small] (main.py 120): INFO number of params: 44417416
[2024-02-22 18:03:58 vssm_small] (main.py 123): INFO number of GFLOPs: 11.231522784
[2024-02-22 18:03:58 vssm_small] (main.py 167): INFO auto resuming from ./res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth
[2024-02-22 18:03:58 vssm_small] (utils.py 18): INFO ==============> Resuming from ./res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth....................
[2024-02-22 18:04:00 vssm_small] (utils.py 27): INFO resuming model: <All keys matched successfully>
[2024-02-22 18:04:00 vssm_small] (utils.py 34): INFO resuming model_ema: <All keys matched successfully>
[2024-02-22 18:04:00 vssm_small] (utils.py 48): INFO => loaded successfully './res_vmamba_cnf241_result_best/vssm_small/default/ckpt_epoch_166.pth' (epoch 166)
[2024-02-22 18:04:11 vssm_small] (main.py 324): INFO Test: [0/164] Time 10.625 (10.625) Loss 0.3557 (0.3557) Acc@1 92.188 (92.188) Acc@5 99.219 (99.219) Mem 7155MB
[2024-02-22 18:04:16 vssm_small] (main.py 324): INFO Test: [10/164] Time 0.483 (1.405) Loss 0.9014 (0.9043) Acc@1 75.000 (78.622) Acc@5 96.094 (95.099) Mem 7155MB
[2024-02-22 18:04:21 vssm_small] (main.py 324): INFO Test: [20/164] Time 0.483 (0.966) Loss 1.3975 (1.0178) Acc@1 73.438 (75.818) Acc@5 89.844 (95.238) Mem 7155MB
[2024-02-22 18:04:25 vssm_small] (main.py 324): INFO Test: [30/164] Time 0.483 (0.810) Loss 0.5830 (0.9851) Acc@1 86.719 (77.344) Acc@5 96.875 (94.960) Mem 7155MB
[2024-02-22 18:04:30 vssm_small] (main.py 324): INFO Test: [40/164] Time 0.483 (0.730) Loss 0.9902 (0.9247) Acc@1 77.344 (78.925) Acc@5 95.312 (95.332) Mem 7155MB
[2024-02-22 18:04:35 vssm_small] (main.py 324): INFO Test: [50/164] Time 0.483 (0.682) Loss 1.3584 (0.9845) Acc@1 69.531 (77.681) Acc@5 92.969 (94.838) Mem 7155MB
[2024-02-22 18:04:40 vssm_small] (main.py 324): INFO Test: [60/164] Time 0.483 (0.649) Loss 1.1582 (1.0527) Acc@1 70.312 (75.999) Acc@5 96.875 (94.237) Mem 7155MB
[2024-02-22 18:04:45 vssm_small] (main.py 324): INFO Test: [70/164] Time 0.483 (0.626) Loss 0.4968 (1.0371) Acc@1 89.844 (76.320) Acc@5 96.875 (94.311) Mem 7155MB
[2024-02-22 18:04:50 vssm_small] (main.py 324): INFO Test: [80/164] Time 0.483 (0.608) Loss 0.4280 (1.0560) Acc@1 91.406 (75.965) Acc@5 98.438 (94.088) Mem 7155MB
[2024-02-22 18:04:54 vssm_small] (main.py 324): INFO Test: [90/164] Time 0.483 (0.594) Loss 1.0479 (1.0186) Acc@1 71.875 (76.829) Acc@5 99.219 (94.420) Mem 7155MB
[2024-02-22 18:04:59 vssm_small] (main.py 324): INFO Test: [100/164] Time 0.483 (0.583) Loss 0.5444 (1.0171) Acc@1 82.812 (77.158) Acc@5 100.000 (94.307) Mem 7155MB
[2024-02-22 18:05:04 vssm_small] (main.py 324): INFO Test: [110/164] Time 0.483 (0.574) Loss 1.3740 (1.0362) Acc@1 67.188 (76.464) Acc@5 96.875 (94.348) Mem 7155MB
[2024-02-22 18:05:09 vssm_small] (main.py 324): INFO Test: [120/164] Time 0.483 (0.567) Loss 2.1602 (1.0386) Acc@1 33.594 (76.220) Acc@5 89.844 (94.441) Mem 7155MB
[2024-02-22 18:05:14 vssm_small] (main.py 324): INFO Test: [130/164] Time 0.483 (0.560) Loss 1.2930 (1.0532) Acc@1 57.812 (75.889) Acc@5 98.438 (94.281) Mem 7155MB
[2024-02-22 18:05:19 vssm_small] (main.py 324): INFO Test: [140/164] Time 0.483 (0.555) Loss 0.7490 (1.0376) Acc@1 81.250 (76.141) Acc@5 96.094 (94.437) Mem 7155MB
[2024-02-22 18:05:23 vssm_small] (main.py 324): INFO Test: [150/164] Time 0.482 (0.550) Loss 1.1650 (1.0309) Acc@1 69.531 (76.293) Acc@5 98.438 (94.521) Mem 7155MB
[2024-02-22 18:05:28 vssm_small] (main.py 324): INFO Test: [160/164] Time 0.483 (0.546) Loss 0.5903 (1.0296) Acc@1 89.844 (76.305) Acc@5 95.312 (94.580) Mem 7155MB
[2024-02-22 18:05:31 vssm_small] (main.py 331): INFO * Acc@1 76.541 Acc@5 94.638
[2024-02-22 18:05:31 vssm_small] (main.py 174): INFO Accuracy of the network on the 20943 test images: 76.5%
[2024-02-22 18:05:39 vssm_small] (main.py 324): INFO Test: [0/164] Time 8.835 (8.835) Loss 0.4526 (0.4526) Acc@1 89.844 (89.844) Acc@5 99.219 (99.219) Mem 7155MB
[2024-02-22 18:05:44 vssm_small] (main.py 324): INFO Test: [10/164] Time 0.482 (1.242) Loss 1.1172 (0.8497) Acc@1 67.969 (79.830) Acc@5 96.094 (95.739) Mem 7155MB
[2024-02-22 18:05:49 vssm_small] (main.py 324): INFO Test: [20/164] Time 0.483 (0.880) Loss 1.3506 (0.9275) Acc@1 72.656 (77.567) Acc@5 92.188 (96.168) Mem 7155MB
[2024-02-22 18:05:54 vssm_small] (main.py 324): INFO Test: [30/164] Time 0.483 (0.752) Loss 0.6631 (0.9005) Acc@1 84.375 (79.133) Acc@5 96.875 (95.640) Mem 7155MB
[2024-02-22 18:05:59 vssm_small] (main.py 324): INFO Test: [40/164] Time 0.483 (0.686) Loss 0.8730 (0.8447) Acc@1 78.906 (80.640) Acc@5 96.094 (95.941) Mem 7155MB
[2024-02-22 18:06:03 vssm_small] (main.py 324): INFO Test: [50/164] Time 0.483 (0.646) Loss 1.4102 (0.9097) Acc@1 66.406 (79.350) Acc@5 92.969 (95.343) Mem 7155MB
[2024-02-22 18:06:08 vssm_small] (main.py 324): INFO Test: [60/164] Time 0.482 (0.620) Loss 1.1191 (0.9768) Acc@1 67.969 (77.818) Acc@5 96.875 (94.762) Mem 7155MB
[2024-02-22 18:06:13 vssm_small] (main.py 324): INFO Test: [70/164] Time 0.482 (0.600) Loss 0.4170 (0.9548) Acc@1 90.625 (78.191) Acc@5 96.094 (94.971) Mem 7155MB
[2024-02-22 18:06:18 vssm_small] (main.py 324): INFO Test: [80/164] Time 0.482 (0.586) Loss 0.4082 (0.9766) Acc@1 91.406 (77.778) Acc@5 98.438 (94.676) Mem 7155MB
[2024-02-22 18:06:23 vssm_small] (main.py 324): INFO Test: [90/164] Time 0.482 (0.574) Loss 1.0576 (0.9459) Acc@1 72.656 (78.546) Acc@5 99.219 (94.943) Mem 7155MB
[2024-02-22 18:06:28 vssm_small] (main.py 324): INFO Test: [100/164] Time 0.483 (0.565) Loss 0.5508 (0.9468) Acc@1 84.375 (78.860) Acc@5 100.000 (94.825) Mem 7155MB
[2024-02-22 18:06:32 vssm_small] (main.py 324): INFO Test: [110/164] Time 0.483 (0.558) Loss 1.1367 (0.9615) Acc@1 68.750 (78.202) Acc@5 97.656 (94.869) Mem 7155MB
[2024-02-22 18:06:37 vssm_small] (main.py 324): INFO Test: [120/164] Time 0.482 (0.552) Loss 2.1855 (0.9641) Acc@1 29.688 (77.893) Acc@5 91.406 (94.990) Mem 7155MB
[2024-02-22 18:06:42 vssm_small] (main.py 324): INFO Test: [130/164] Time 0.483 (0.546) Loss 1.2090 (0.9734) Acc@1 60.156 (77.642) Acc@5 99.219 (94.931) Mem 7155MB
[2024-02-22 18:06:47 vssm_small] (main.py 324): INFO Test: [140/164] Time 0.483 (0.542) Loss 0.6606 (0.9576) Acc@1 82.812 (77.876) Acc@5 99.219 (95.107) Mem 7155MB
[2024-02-22 18:06:52 vssm_small] (main.py 324): INFO Test: [150/164] Time 0.482 (0.538) Loss 0.9053 (0.9501) Acc@1 77.344 (78.084) Acc@5 98.438 (95.183) Mem 7155MB
[2024-02-22 18:06:57 vssm_small] (main.py 324): INFO Test: [160/164] Time 0.482 (0.534) Loss 0.5884 (0.9481) Acc@1 86.719 (78.023) Acc@5 96.875 (95.259) Mem 7155MB
[2024-02-22 18:06:58 vssm_small] (main.py 331): INFO * Acc@1 78.260 Acc@5 95.306
[2024-02-22 18:06:58 vssm_small] (main.py 177): INFO Accuracy of the network ema on the 20943 test images: 78.3%
[2024-02-22 18:06:58 vssm_small] (main.py 196): INFO Start training
[2024-02-22 18:07:10 vssm_small] (main.py 274): INFO Train: [167/300][0/933] eta 3:02:27 lr 0.000058 wd 0.0500 time 11.7339 (11.7339) data time 8.6883 (8.6883) loss 3.2837 (3.2837) grad_norm 7.2753 (7.2753) loss_scale 32768.0000 (32768.0000) mem 50097MB
[2024-02-22 18:07:26 vssm_small] (main.py 274): INFO Train: [167/300][10/933] eta 0:38:21 lr 0.000058 wd 0.0500 time 1.5683 (2.4935) data time 0.0006 (0.7903) loss 2.0028 (3.0696) grad_norm 5.6539 (6.8326) loss_scale 32768.0000 (32768.0000) mem 50285MB
[2024-02-22 18:07:41 vssm_small] (main.py 274): INFO Train: [167/300][20/933] eta 0:31:15 lr 0.000058 wd 0.0500 time 1.5680 (2.0546) data time 0.0006 (0.4143) loss 2.7450 (2.9151) grad_norm 4.8270 (6.2856) loss_scale 32768.0000 (32768.0000) mem 50285MB
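The grad_norm and loss_scale columns in the Train lines above come from gradient clipping (TRAIN.CLIP_GRAD: 5.0) and the AMP loss scaler (AMP_ENABLE: true; scale 32768 here). A minimal sketch of one AMP training step combining the two; the exact main.py logic may differ:

    # Sketch: one AMP training step with gradient clipping, consistent with the
    # grad_norm / loss_scale values logged above (CLIP_GRAD = 5.0).
    import torch

    def train_step(model, criterion, optimizer, scaler, images, targets, clip_grad=5.0):
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():
            loss = criterion(model(images), targets)
        scaler.scale(loss).backward()
        scaler.unscale_(optimizer)  # restore true-scale gradients before clipping
        grad_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), clip_grad)
        scaler.step(optimizer)
        scaler.update()
        return loss.item(), grad_norm.item(), scaler.get_scale()  # scale e.g. 32768.0

    # scaler = torch.cuda.amp.GradScaler() is created once before the training loop.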