vall-e_libritts/libritts/log/log-train-2024-08-06-03-39-40-5
2024-08-06 03:39:40,357 INFO [trainer.py:870] (5/8) Training started
2024-08-06 03:39:40,358 INFO [trainer.py:889] (5/8) Device: cuda:5
2024-08-06 03:39:40,358 INFO [trainer.py:890] (5/8) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 100, 'reset_interval': 200, 'valid_interval': 2000, 'env_info': {'k2-version': '1.24.3', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '279b0c87015a615b81b147251814d737a548f397', 'k2-git-date': 'Wed May 24 22:24:09 2023', 'lhotse-version': '1.26.0', 'torch-version': '2.0.1+cu118', 'torch-cuda-available': True, 'torch-cuda-version': '11.8', 'python-version': '3.10', 'icefall-git-branch': 'main', 'icefall-git-sha1': '7d2e5f4-dirty', 'icefall-git-date': 'Tue Aug 6 02:59:12 2024', 'icefall-path': '/workspace/icefall_llm', 'k2-path': '/usr/local/lib/python3.10/dist-packages/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.10/dist-packages/lhotse/__init__.py', 'hostname': '6865771', 'IP address': '0.104.195.107'}, 'world_size': 8, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 20, 'start_epoch': 1, 'start_batch': 0, 'exp_dir': PosixPath('exp/valle'), 'optimizer_name': 'ScaledAdam', 'scheduler_name': 'Eden', 'base_lr': 0.03, 'warmup_steps': 200, 'seed': 42, 'inf_check': False, 'save_every_n': 1000, 'keep_last_k': 20, 'average_period': 0, 'accumulate_grad_steps': 1, 'dtype': 'bfloat16', 'filter_min_duration': 0.5, 'filter_max_duration': 14.0, 'train_stage': 1, 'visualize': False, 'oom_check': False, 'model_name': 'valle', 'decoder_dim': 1024, 'nhead': 16, 'num_decoder_layers': 12, 'scale_factor': 1.0, 'norm_first': True, 'add_prenet': False, 'prefix_mode': 1, 'share_embedding': True, 'prepend_bos': False, 'num_quantizers': 8, 'scaling_xformers': False, 'manifest_dir': PosixPath('data/tokenized'), 'max_duration': 320, 'bucketing_sampler': True, 'num_buckets': 6, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 0.1, 'on_the_fly_feats': False, 'shuffle': True, 'buffer_size': 40000, 'shuffle_buffer_size': 100000, 'drop_last': False, 'return_cuts': True, 'num_workers': 8, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'input_strategy': 'PrecomputedFeatures', 'dataset': 'libritts', 'text_tokens': 'data/tokenized/unique_text_tokens.k2symbols', 'sampling_rate': 24000}
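The line above dumps the full hyperparameter dict. Scalar values such as base_lr, warmup_steps or dtype can be pulled back out of the logged string with a small helper; a minimal sketch using only the Python standard library (the helper name get_hparam is illustrative, not part of icefall, and nested entries such as env_info are not handled):

import re

def get_hparam(config_line: str, key: str):
    # Return the raw text of a scalar hyperparameter from the logged dict,
    # e.g. get_hparam(line, "base_lr") -> "0.03".
    m = re.search(rf"'{re.escape(key)}': ([^,}}]+)", config_line)
    return m.group(1) if m else None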
2024-08-06 03:39:40,358 INFO [trainer.py:892] (5/8) About to create model
2024-08-06 03:39:41,131 INFO [trainer.py:899] (5/8) Number of model parameters: 367386628
2024-08-06 03:39:41,936 INFO [trainer.py:914] (5/8) Using DDP
2024-08-06 03:39:43,994 INFO [datamodule.py:427] (5/8) About to get train cuts
2024-08-06 03:39:43,996 INFO [datamodule.py:434] (5/8) About to get dev cuts
2024-08-06 03:39:43,997 INFO [datamodule.py:292] (5/8) Disable SpecAugment
2024-08-06 03:39:43,997 INFO [datamodule.py:294] (5/8) About to create train dataset
2024-08-06 03:39:43,998 INFO [datamodule.py:323] (5/8) Using DynamicBucketingSampler
2024-08-06 03:39:44,608 INFO [datamodule.py:344] (5/8) About to create train dataloader
2024-08-06 03:39:44,608 INFO [datamodule.py:367] (5/8) About to create dev dataset
2024-08-06 03:39:44,938 INFO [datamodule.py:388] (5/8) About to create dev dataloader
2024-08-06 03:40:39,571 INFO [trainer.py:765] (5/8) Epoch 1, batch 100, train_loss[loss=4.067, ArTop10Accuracy=0.5153, over 14698.00 frames. ], tot_loss[loss=4.768, ArTop10Accuracy=0.3985, over 4803.94 frames. ], batch size: 61, lr: 2.25e-02
2024-08-06 03:41:16,923 INFO [trainer.py:765] (5/8) Epoch 1, batch 200, train_loss[loss=3.914, ArTop10Accuracy=0.5413, over 13982.00 frames. ], tot_loss[loss=4.296, ArTop10Accuracy=0.4768, over 7819.22 frames. ], batch size: 34, lr: 3.00e-02
2024-08-06 03:41:57,951 INFO [trainer.py:765] (5/8) Epoch 1, batch 300, train_loss[loss=3.74, ArTop10Accuracy=0.5657, over 14595.00 frames. ], tot_loss[loss=4.092, ArTop10Accuracy=0.51, over 9441.11 frames. ], batch size: 44, lr: 3.00e-02
2024-08-06 03:42:33,080 INFO [trainer.py:765] (5/8) Epoch 1, batch 400, train_loss[loss=3.832, ArTop10Accuracy=0.544, over 11085.00 frames. ], tot_loss[loss=3.943, ArTop10Accuracy=0.5348, over 10345.48 frames. ], batch size: 15, lr: 3.00e-02
2024-08-06 03:43:11,272 INFO [trainer.py:765] (5/8) Epoch 1, batch 500, train_loss[loss=3.575, ArTop10Accuracy=0.6015, over 12279.00 frames. ], tot_loss[loss=3.827, ArTop10Accuracy=0.554, over 10905.73 frames. ], batch size: 22, lr: 2.99e-02
2024-08-06 03:43:46,593 INFO [trainer.py:765] (5/8) Epoch 1, batch 600, train_loss[loss=3.53, ArTop10Accuracy=0.6034, over 11551.00 frames. ], tot_loss[loss=3.743, ArTop10Accuracy=0.5684, over 11432.17 frames. ], batch size: 18, lr: 2.99e-02
2024-08-06 03:44:27,899 INFO [trainer.py:765] (5/8) Epoch 1, batch 700, train_loss[loss=3.449, ArTop10Accuracy=0.6251, over 10225.00 frames. ], tot_loss[loss=3.684, ArTop10Accuracy=0.5786, over 11587.85 frames. ], batch size: 12, lr: 2.99e-02
2024-08-06 03:45:01,514 INFO [trainer.py:765] (5/8) Epoch 1, batch 800, train_loss[loss=3.53, ArTop10Accuracy=0.6113, over 10199.00 frames. ], tot_loss[loss=3.637, ArTop10Accuracy=0.5872, over 11703.53 frames. ], batch size: 12, lr: 2.98e-02
2024-08-06 03:45:32,556 INFO [trainer.py:765] (5/8) Epoch 1, batch 900, train_loss[loss=3.565, ArTop10Accuracy=0.5989, over 12863.00 frames. ], tot_loss[loss=3.592, ArTop10Accuracy=0.5956, over 11754.75 frames. ], batch size: 27, lr: 2.98e-02
2024-08-06 03:46:03,648 INFO [trainer.py:765] (5/8) Epoch 1, batch 1000, train_loss[loss=3.418, ArTop10Accuracy=0.6271, over 13368.00 frames. ], tot_loss[loss=3.562, ArTop10Accuracy=0.6009, over 11949.64 frames. ], batch size: 28, lr: 2.97e-02
2024-08-06 03:46:07,989 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 8.169e+01 1.565e+02 2.239e+02 3.485e+02 9.105e+03, threshold=4.478e+02, percent-clipped=0.0
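The clipping diagnostics above list five grad-norm quantiles followed by the active threshold. Assuming the threshold is derived as clipping_scale times the median (the middle quantile), the numbers in this line are consistent; a quick check of that reading:

clipping_scale = 2.0                              # Clipping_scale printed in the line above
quartiles = [81.69, 156.5, 223.9, 348.5, 9105.0]  # the five logged grad-norm quantiles
print(clipping_scale * quartiles[2])              # 447.8 -- matches threshold=4.478e+02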
2024-08-06 03:46:38,611 INFO [trainer.py:765] (5/8) Epoch 1, batch 1100, train_loss[loss=3.506, ArTop10Accuracy=0.6104, over 13606.00 frames. ], tot_loss[loss=3.532, ArTop10Accuracy=0.6068, over 12001.98 frames. ], batch size: 34, lr: 2.96e-02
2024-08-06 03:47:08,744 INFO [trainer.py:765] (5/8) Epoch 1, batch 1200, train_loss[loss=3.554, ArTop10Accuracy=0.6072, over 12768.00 frames. ], tot_loss[loss=3.504, ArTop10Accuracy=0.6124, over 11966.10 frames. ], batch size: 99, lr: 2.96e-02
2024-08-06 03:47:33,831 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
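Each epoch closes with the "Reaches end of dataloader" line above. For offline analysis of the per-batch progress lines, a minimal parsing sketch using only the Python standard library, assuming the field layout stays exactly as printed in this log (train_loss[...], tot_loss[...], lr: ...); validation and grad-norm lines are simply skipped:

import re

PROGRESS = re.compile(
    r"Epoch (\d+), batch (\d+), "
    r"train_loss\[loss=([\d.]+), ArTop10Accuracy=([\d.]+).*?\], "
    r"tot_loss\[loss=([\d.]+), ArTop10Accuracy=([\d.]+).*?\].*?"
    r"lr: ([\d.e+-]+)"
)

def parse_progress(path):
    # Yield (epoch, batch, train_loss, train_acc, tot_loss, tot_acc, lr)
    # for every per-batch progress line in the log file.
    with open(path) as f:
        for line in f:
            m = PROGRESS.search(line)
            if m:
                epoch, batch = int(m.group(1)), int(m.group(2))
                rest = [float(g) for g in m.groups()[2:]]
                yield (epoch, batch, *rest)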
2024-08-06 03:48:38,676 INFO [trainer.py:765] (5/8) Epoch 2, batch 100, train_loss[loss=3.509, ArTop10Accuracy=0.6145, over 15383.00 frames. ], tot_loss[loss=3.462, ArTop10Accuracy=0.6208, over 4782.40 frames. ], batch size: 62, lr: 2.90e-02
2024-08-06 03:49:14,597 INFO [trainer.py:765] (5/8) Epoch 2, batch 200, train_loss[loss=3.493, ArTop10Accuracy=0.6161, over 13591.00 frames. ], tot_loss[loss=3.439, ArTop10Accuracy=0.6244, over 7778.19 frames. ], batch size: 34, lr: 2.89e-02
2024-08-06 03:49:56,519 INFO [trainer.py:765] (5/8) Epoch 2, batch 300, train_loss[loss=3.494, ArTop10Accuracy=0.6171, over 14088.00 frames. ], tot_loss[loss=3.421, ArTop10Accuracy=0.6282, over 9400.23 frames. ], batch size: 44, lr: 2.89e-02
2024-08-06 03:50:31,999 INFO [trainer.py:765] (5/8) Epoch 2, batch 400, train_loss[loss=3.269, ArTop10Accuracy=0.6539, over 10404.00 frames. ], tot_loss[loss=3.42, ArTop10Accuracy=0.6282, over 10326.14 frames. ], batch size: 14, lr: 2.88e-02
2024-08-06 03:51:17,109 INFO [trainer.py:765] (5/8) Epoch 2, batch 500, train_loss[loss=3.403, ArTop10Accuracy=0.6296, over 12285.00 frames. ], tot_loss[loss=3.405, ArTop10Accuracy=0.631, over 10906.93 frames. ], batch size: 22, lr: 2.87e-02
2024-08-06 03:51:53,202 INFO [trainer.py:765] (5/8) Epoch 2, batch 600, train_loss[loss=3.382, ArTop10Accuracy=0.6347, over 11612.00 frames. ], tot_loss[loss=3.403, ArTop10Accuracy=0.6314, over 11426.21 frames. ], batch size: 18, lr: 2.86e-02
2024-08-06 03:52:38,993 INFO [trainer.py:765] (5/8) Epoch 2, batch 700, train_loss[loss=3.322, ArTop10Accuracy=0.6408, over 10116.00 frames. ], tot_loss[loss=3.396, ArTop10Accuracy=0.633, over 11590.48 frames. ], batch size: 12, lr: 2.85e-02
2024-08-06 03:52:47,090 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 03:52:56,023 INFO [trainer.py:811] (5/8) Epoch 2, validation: loss=3.327, ArTop10Accuracy=0.6492, over 1829298.00 frames.
2024-08-06 03:52:56,024 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33247MB
2024-08-06 03:52:56,541 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 8.181e+01 1.431e+02 1.849e+02 2.730e+02 2.344e+03, threshold=3.697e+02, percent-clipped=7.2
2024-08-06 03:53:21,881 INFO [trainer.py:765] (5/8) Epoch 2, batch 800, train_loss[loss=3.297, ArTop10Accuracy=0.6453, over 9991.00 frames. ], tot_loss[loss=3.384, ArTop10Accuracy=0.6354, over 11690.37 frames. ], batch size: 12, lr: 2.84e-02
2024-08-06 03:53:53,299 INFO [trainer.py:765] (5/8) Epoch 2, batch 900, train_loss[loss=3.489, ArTop10Accuracy=0.6104, over 13175.00 frames. ], tot_loss[loss=3.371, ArTop10Accuracy=0.6379, over 11741.36 frames. ], batch size: 27, lr: 2.83e-02
2024-08-06 03:54:24,809 INFO [trainer.py:765] (5/8) Epoch 2, batch 1000, train_loss[loss=3.344, ArTop10Accuracy=0.6403, over 13207.00 frames. ], tot_loss[loss=3.37, ArTop10Accuracy=0.6383, over 11955.27 frames. ], batch size: 28, lr: 2.82e-02
2024-08-06 03:54:56,006 INFO [trainer.py:765] (5/8) Epoch 2, batch 1100, train_loss[loss=3.354, ArTop10Accuracy=0.6401, over 13620.00 frames. ], tot_loss[loss=3.363, ArTop10Accuracy=0.6398, over 12005.23 frames. ], batch size: 34, lr: 2.81e-02
2024-08-06 03:55:26,229 INFO [trainer.py:765] (5/8) Epoch 2, batch 1200, train_loss[loss=3.358, ArTop10Accuracy=0.6407, over 12633.00 frames. ], tot_loss[loss=3.351, ArTop10Accuracy=0.6421, over 11951.95 frames. ], batch size: 99, lr: 2.80e-02
2024-08-06 03:55:51,006 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 03:57:04,102 INFO [trainer.py:765] (5/8) Epoch 3, batch 100, train_loss[loss=3.372, ArTop10Accuracy=0.637, over 14379.00 frames. ], tot_loss[loss=3.305, ArTop10Accuracy=0.6514, over 4797.26 frames. ], batch size: 61, lr: 2.67e-02
2024-08-06 03:57:50,980 INFO [trainer.py:765] (5/8) Epoch 3, batch 200, train_loss[loss=3.295, ArTop10Accuracy=0.6549, over 13611.00 frames. ], tot_loss[loss=3.296, ArTop10Accuracy=0.6531, over 7794.13 frames. ], batch size: 34, lr: 2.66e-02
2024-08-06 03:58:26,074 INFO [trainer.py:765] (5/8) Epoch 3, batch 300, train_loss[loss=3.283, ArTop10Accuracy=0.6571, over 14285.00 frames. ], tot_loss[loss=3.277, ArTop10Accuracy=0.6565, over 9418.33 frames. ], batch size: 44, lr: 2.64e-02
2024-08-06 03:59:11,254 INFO [trainer.py:765] (5/8) Epoch 3, batch 400, train_loss[loss=3.251, ArTop10Accuracy=0.6608, over 10088.00 frames. ], tot_loss[loss=3.266, ArTop10Accuracy=0.659, over 10326.14 frames. ], batch size: 14, lr: 2.63e-02
2024-08-06 03:59:29,675 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 8.720e+01 1.461e+02 1.775e+02 2.344e+02 9.150e+02, threshold=3.550e+02, percent-clipped=5.2
2024-08-06 03:59:49,303 INFO [trainer.py:765] (5/8) Epoch 3, batch 500, train_loss[loss=3.14, ArTop10Accuracy=0.6808, over 12417.00 frames. ], tot_loss[loss=3.251, ArTop10Accuracy=0.6616, over 10905.55 frames. ], batch size: 22, lr: 2.62e-02
2024-08-06 04:00:35,095 INFO [trainer.py:765] (5/8) Epoch 3, batch 600, train_loss[loss=3.259, ArTop10Accuracy=0.654, over 11574.00 frames. ], tot_loss[loss=3.239, ArTop10Accuracy=0.6639, over 11441.83 frames. ], batch size: 18, lr: 2.61e-02
2024-08-06 04:01:22,059 INFO [trainer.py:765] (5/8) Epoch 3, batch 700, train_loss[loss=3.175, ArTop10Accuracy=0.6775, over 9982.00 frames. ], tot_loss[loss=3.231, ArTop10Accuracy=0.6655, over 11581.37 frames. ], batch size: 12, lr: 2.60e-02
2024-08-06 04:01:56,269 INFO [trainer.py:765] (5/8) Epoch 3, batch 800, train_loss[loss=3.022, ArTop10Accuracy=0.7074, over 10282.00 frames. ], tot_loss[loss=3.225, ArTop10Accuracy=0.6669, over 11692.55 frames. ], batch size: 12, lr: 2.59e-02
2024-08-06 04:02:27,740 INFO [trainer.py:765] (5/8) Epoch 3, batch 900, train_loss[loss=3.11, ArTop10Accuracy=0.6884, over 12949.00 frames. ], tot_loss[loss=3.209, ArTop10Accuracy=0.6702, over 11726.02 frames. ], batch size: 27, lr: 2.57e-02
2024-08-06 04:02:59,284 INFO [trainer.py:765] (5/8) Epoch 3, batch 1000, train_loss[loss=3.151, ArTop10Accuracy=0.6834, over 12768.00 frames. ], tot_loss[loss=3.195, ArTop10Accuracy=0.6728, over 11939.30 frames. ], batch size: 27, lr: 2.56e-02
2024-08-06 04:03:30,942 INFO [trainer.py:765] (5/8) Epoch 3, batch 1100, train_loss[loss=3.21, ArTop10Accuracy=0.6757, over 13656.00 frames. ], tot_loss[loss=3.19, ArTop10Accuracy=0.6739, over 11995.70 frames. ], batch size: 34, lr: 2.55e-02
2024-08-06 04:04:01,313 INFO [trainer.py:765] (5/8) Epoch 3, batch 1200, train_loss[loss=3.201, ArTop10Accuracy=0.6668, over 11681.00 frames. ], tot_loss[loss=3.182, ArTop10Accuracy=0.6754, over 11931.92 frames. ], batch size: 99, lr: 2.54e-02
2024-08-06 04:04:26,820 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:05:43,368 INFO [trainer.py:765] (5/8) Epoch 4, batch 100, train_loss[loss=3.204, ArTop10Accuracy=0.6698, over 14500.00 frames. ], tot_loss[loss=3.138, ArTop10Accuracy=0.6847, over 4766.81 frames. ], batch size: 61, lr: 2.38e-02
2024-08-06 04:06:07,077 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 04:06:16,404 INFO [trainer.py:811] (5/8) Epoch 4, validation: loss=3.063, ArTop10Accuracy=0.7031, over 1829298.00 frames.
2024-08-06 04:06:16,405 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33247MB
2024-08-06 04:06:16,746 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.091e+02 1.493e+02 1.709e+02 2.068e+02 7.969e+02, threshold=3.418e+02, percent-clipped=2.9
2024-08-06 04:06:31,826 INFO [trainer.py:765] (5/8) Epoch 4, batch 200, train_loss[loss=3.068, ArTop10Accuracy=0.7023, over 13915.00 frames. ], tot_loss[loss=3.12, ArTop10Accuracy=0.6881, over 7780.21 frames. ], batch size: 34, lr: 2.37e-02
2024-08-06 04:07:18,545 INFO [trainer.py:765] (5/8) Epoch 4, batch 300, train_loss[loss=3.182, ArTop10Accuracy=0.6698, over 14364.00 frames. ], tot_loss[loss=3.116, ArTop10Accuracy=0.6888, over 9416.98 frames. ], batch size: 44, lr: 2.36e-02
2024-08-06 04:08:01,910 INFO [trainer.py:765] (5/8) Epoch 4, batch 400, train_loss[loss=3.235, ArTop10Accuracy=0.6675, over 10712.00 frames. ], tot_loss[loss=3.112, ArTop10Accuracy=0.6896, over 10331.30 frames. ], batch size: 15, lr: 2.34e-02
2024-08-06 04:08:45,345 INFO [trainer.py:765] (5/8) Epoch 4, batch 500, train_loss[loss=3.071, ArTop10Accuracy=0.6985, over 12318.00 frames. ], tot_loss[loss=3.103, ArTop10Accuracy=0.6908, over 10905.47 frames. ], batch size: 22, lr: 2.33e-02
2024-08-06 04:09:37,072 INFO [trainer.py:765] (5/8) Epoch 4, batch 600, train_loss[loss=2.998, ArTop10Accuracy=0.7101, over 11544.00 frames. ], tot_loss[loss=3.106, ArTop10Accuracy=0.6899, over 11446.33 frames. ], batch size: 18, lr: 2.32e-02
2024-08-06 04:10:13,502 INFO [trainer.py:765] (5/8) Epoch 4, batch 700, train_loss[loss=3.127, ArTop10Accuracy=0.6854, over 10712.00 frames. ], tot_loss[loss=3.109, ArTop10Accuracy=0.6895, over 11597.86 frames. ], batch size: 13, lr: 2.31e-02
2024-08-06 04:10:51,960 INFO [trainer.py:765] (5/8) Epoch 4, batch 800, train_loss[loss=3.118, ArTop10Accuracy=0.6844, over 10125.00 frames. ], tot_loss[loss=3.112, ArTop10Accuracy=0.6893, over 11711.35 frames. ], batch size: 12, lr: 2.30e-02
2024-08-06 04:11:23,331 INFO [trainer.py:765] (5/8) Epoch 4, batch 900, train_loss[loss=3.065, ArTop10Accuracy=0.6956, over 12823.00 frames. ], tot_loss[loss=3.101, ArTop10Accuracy=0.6914, over 11744.91 frames. ], batch size: 27, lr: 2.29e-02
2024-08-06 04:11:54,826 INFO [trainer.py:765] (5/8) Epoch 4, batch 1000, train_loss[loss=3.113, ArTop10Accuracy=0.695, over 12969.00 frames. ], tot_loss[loss=3.108, ArTop10Accuracy=0.6902, over 11929.79 frames. ], batch size: 27, lr: 2.28e-02
2024-08-06 04:12:25,960 INFO [trainer.py:765] (5/8) Epoch 4, batch 1100, train_loss[loss=3.165, ArTop10Accuracy=0.6787, over 13719.00 frames. ], tot_loss[loss=3.106, ArTop10Accuracy=0.6906, over 11981.14 frames. ], batch size: 34, lr: 2.26e-02
2024-08-06 04:12:48,544 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.106e+02 1.440e+02 1.608e+02 1.893e+02 7.925e+02, threshold=3.216e+02, percent-clipped=2.0
2024-08-06 04:12:58,828 INFO [trainer.py:765] (5/8) Epoch 4, batch 1200, train_loss[loss=3.153, ArTop10Accuracy=0.6798, over 13163.00 frames. ], tot_loss[loss=3.105, ArTop10Accuracy=0.691, over 11930.69 frames. ], batch size: 99, lr: 2.25e-02
2024-08-06 04:13:24,395 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:14:38,685 INFO [trainer.py:765] (5/8) Epoch 5, batch 100, train_loss[loss=3.103, ArTop10Accuracy=0.6941, over 14277.00 frames. ], tot_loss[loss=3.071, ArTop10Accuracy=0.6987, over 4775.62 frames. ], batch size: 61, lr: 2.10e-02
2024-08-06 04:15:26,826 INFO [trainer.py:765] (5/8) Epoch 5, batch 200, train_loss[loss=3.121, ArTop10Accuracy=0.6934, over 13238.00 frames. ], tot_loss[loss=3.062, ArTop10Accuracy=0.6999, over 7787.15 frames. ], batch size: 33, lr: 2.09e-02
2024-08-06 04:16:08,011 INFO [trainer.py:765] (5/8) Epoch 5, batch 300, train_loss[loss=3.047, ArTop10Accuracy=0.6996, over 14418.00 frames. ], tot_loss[loss=3.052, ArTop10Accuracy=0.7016, over 9417.09 frames. ], batch size: 45, lr: 2.08e-02
2024-08-06 04:16:53,133 INFO [trainer.py:765] (5/8) Epoch 5, batch 400, train_loss[loss=2.983, ArTop10Accuracy=0.718, over 10772.00 frames. ], tot_loss[loss=3.051, ArTop10Accuracy=0.7018, over 10324.21 frames. ], batch size: 15, lr: 2.07e-02
2024-08-06 04:17:36,638 INFO [trainer.py:765] (5/8) Epoch 5, batch 500, train_loss[loss=3.097, ArTop10Accuracy=0.699, over 12047.00 frames. ], tot_loss[loss=3.052, ArTop10Accuracy=0.7012, over 10893.96 frames. ], batch size: 22, lr: 2.06e-02
2024-08-06 04:18:22,114 INFO [trainer.py:765] (5/8) Epoch 5, batch 600, train_loss[loss=2.971, ArTop10Accuracy=0.7173, over 11561.00 frames. ], tot_loss[loss=3.049, ArTop10Accuracy=0.7017, over 11426.06 frames. ], batch size: 18, lr: 2.05e-02
2024-08-06 04:19:17,032 INFO [trainer.py:765] (5/8) Epoch 5, batch 700, train_loss[loss=2.911, ArTop10Accuracy=0.7209, over 10102.00 frames. ], tot_loss[loss=3.06, ArTop10Accuracy=0.6995, over 11566.59 frames. ], batch size: 12, lr: 2.04e-02
2024-08-06 04:19:51,066 INFO [trainer.py:765] (5/8) Epoch 5, batch 800, train_loss[loss=3.042, ArTop10Accuracy=0.7093, over 9885.00 frames. ], tot_loss[loss=3.061, ArTop10Accuracy=0.6991, over 11673.70 frames. ], batch size: 12, lr: 2.03e-02
2024-08-06 04:20:18,214 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 04:20:27,475 INFO [trainer.py:811] (5/8) Epoch 5, validation: loss=2.998, ArTop10Accuracy=0.7157, over 1829298.00 frames.
2024-08-06 04:20:27,476 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33247MB
2024-08-06 04:20:27,781 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.057e+02 1.385e+02 1.542e+02 1.759e+02 7.741e+02, threshold=3.083e+02, percent-clipped=0.7
2024-08-06 04:20:31,767 INFO [trainer.py:765] (5/8) Epoch 5, batch 900, train_loss[loss=3.013, ArTop10Accuracy=0.7121, over 13101.00 frames. ], tot_loss[loss=3.055, ArTop10Accuracy=0.7004, over 11734.70 frames. ], batch size: 27, lr: 2.02e-02
2024-08-06 04:21:03,305 INFO [trainer.py:765] (5/8) Epoch 5, batch 1000, train_loss[loss=3.14, ArTop10Accuracy=0.6786, over 12754.00 frames. ], tot_loss[loss=3.062, ArTop10Accuracy=0.6993, over 11949.56 frames. ], batch size: 27, lr: 2.01e-02
2024-08-06 04:21:34,452 INFO [trainer.py:765] (5/8) Epoch 5, batch 1100, train_loss[loss=3.189, ArTop10Accuracy=0.6789, over 13715.00 frames. ], tot_loss[loss=3.066, ArTop10Accuracy=0.6984, over 11984.84 frames. ], batch size: 34, lr: 2.00e-02
2024-08-06 04:22:04,752 INFO [trainer.py:765] (5/8) Epoch 5, batch 1200, train_loss[loss=3.185, ArTop10Accuracy=0.6753, over 11743.00 frames. ], tot_loss[loss=3.063, ArTop10Accuracy=0.6992, over 11930.58 frames. ], batch size: 97, lr: 1.99e-02
2024-08-06 04:22:30,423 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:23:46,282 INFO [trainer.py:765] (5/8) Epoch 6, batch 100, train_loss[loss=3.06, ArTop10Accuracy=0.7028, over 14806.00 frames. ], tot_loss[loss=3.029, ArTop10Accuracy=0.7069, over 4774.08 frames. ], batch size: 61, lr: 1.85e-02
2024-08-06 04:24:35,255 INFO [trainer.py:765] (5/8) Epoch 6, batch 200, train_loss[loss=3.022, ArTop10Accuracy=0.705, over 13776.00 frames. ], tot_loss[loss=3.015, ArTop10Accuracy=0.7093, over 7781.68 frames. ], batch size: 34, lr: 1.84e-02
2024-08-06 04:25:16,676 INFO [trainer.py:765] (5/8) Epoch 6, batch 300, train_loss[loss=2.984, ArTop10Accuracy=0.7149, over 14326.00 frames. ], tot_loss[loss=3.01, ArTop10Accuracy=0.7101, over 9419.02 frames. ], batch size: 44, lr: 1.83e-02
2024-08-06 04:26:08,924 INFO [trainer.py:765] (5/8) Epoch 6, batch 400, train_loss[loss=2.891, ArTop10Accuracy=0.7361, over 10355.00 frames. ], tot_loss[loss=3.008, ArTop10Accuracy=0.7106, over 10336.18 frames. ], batch size: 14, lr: 1.83e-02
2024-08-06 04:26:51,485 INFO [trainer.py:765] (5/8) Epoch 6, batch 500, train_loss[loss=2.96, ArTop10Accuracy=0.7243, over 12225.00 frames. ], tot_loss[loss=3.01, ArTop10Accuracy=0.71, over 10905.80 frames. ], batch size: 22, lr: 1.82e-02
2024-08-06 04:27:39,298 INFO [trainer.py:765] (5/8) Epoch 6, batch 600, train_loss[loss=3.017, ArTop10Accuracy=0.7087, over 11651.00 frames. ], tot_loss[loss=3.009, ArTop10Accuracy=0.7098, over 11422.13 frames. ], batch size: 18, lr: 1.81e-02
2024-08-06 04:27:46,370 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.054e+02 1.343e+02 1.474e+02 1.660e+02 8.574e+02, threshold=2.947e+02, percent-clipped=0.6
2024-08-06 04:28:33,240 INFO [trainer.py:765] (5/8) Epoch 6, batch 700, train_loss[loss=3.031, ArTop10Accuracy=0.6912, over 10022.00 frames. ], tot_loss[loss=3.021, ArTop10Accuracy=0.7076, over 11562.47 frames. ], batch size: 12, lr: 1.80e-02
2024-08-06 04:29:11,216 INFO [trainer.py:765] (5/8) Epoch 6, batch 800, train_loss[loss=2.88, ArTop10Accuracy=0.7336, over 9392.00 frames. ], tot_loss[loss=3.027, ArTop10Accuracy=0.7067, over 11663.92 frames. ], batch size: 11, lr: 1.79e-02
2024-08-06 04:29:42,752 INFO [trainer.py:765] (5/8) Epoch 6, batch 900, train_loss[loss=2.986, ArTop10Accuracy=0.7112, over 12830.00 frames. ], tot_loss[loss=3.017, ArTop10Accuracy=0.7085, over 11716.20 frames. ], batch size: 27, lr: 1.78e-02
2024-08-06 04:30:14,306 INFO [trainer.py:765] (5/8) Epoch 6, batch 1000, train_loss[loss=2.923, ArTop10Accuracy=0.7397, over 12980.00 frames. ], tot_loss[loss=3.021, ArTop10Accuracy=0.7075, over 11927.22 frames. ], batch size: 27, lr: 1.77e-02
2024-08-06 04:30:45,384 INFO [trainer.py:765] (5/8) Epoch 6, batch 1100, train_loss[loss=3.095, ArTop10Accuracy=0.6964, over 13609.00 frames. ], tot_loss[loss=3.026, ArTop10Accuracy=0.7068, over 11981.75 frames. ], batch size: 34, lr: 1.77e-02
2024-08-06 04:31:15,673 INFO [trainer.py:765] (5/8) Epoch 6, batch 1200, train_loss[loss=3.138, ArTop10Accuracy=0.6884, over 11948.00 frames. ], tot_loss[loss=3.026, ArTop10Accuracy=0.7065, over 11922.32 frames. ], batch size: 97, lr: 1.76e-02
2024-08-06 04:31:40,725 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:32:52,405 INFO [trainer.py:765] (5/8) Epoch 7, batch 100, train_loss[loss=3.055, ArTop10Accuracy=0.704, over 14658.00 frames. ], tot_loss[loss=2.993, ArTop10Accuracy=0.7144, over 4807.38 frames. ], batch size: 61, lr: 1.64e-02
2024-08-06 04:33:38,224 INFO [trainer.py:765] (5/8) Epoch 7, batch 200, train_loss[loss=2.968, ArTop10Accuracy=0.7192, over 13532.00 frames. ], tot_loss[loss=2.98, ArTop10Accuracy=0.7164, over 7806.01 frames. ], batch size: 34, lr: 1.64e-02
2024-08-06 04:34:22,609 INFO [trainer.py:765] (5/8) Epoch 7, batch 300, train_loss[loss=3.067, ArTop10Accuracy=0.6953, over 14506.00 frames. ], tot_loss[loss=2.981, ArTop10Accuracy=0.7162, over 9431.21 frames. ], batch size: 44, lr: 1.63e-02
2024-08-06 04:34:36,847 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 04:34:45,808 INFO [trainer.py:811] (5/8) Epoch 7, validation: loss=2.963, ArTop10Accuracy=0.7233, over 1829298.00 frames.
2024-08-06 04:34:45,809 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 04:34:46,125 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.009e+02 1.306e+02 1.435e+02 1.599e+02 8.689e+02, threshold=2.871e+02, percent-clipped=0.9
2024-08-06 04:35:17,146 INFO [trainer.py:765] (5/8) Epoch 7, batch 400, train_loss[loss=2.841, ArTop10Accuracy=0.7387, over 10467.00 frames. ], tot_loss[loss=2.978, ArTop10Accuracy=0.7164, over 10340.88 frames. ], batch size: 14, lr: 1.62e-02
2024-08-06 04:36:01,711 INFO [trainer.py:765] (5/8) Epoch 7, batch 500, train_loss[loss=2.971, ArTop10Accuracy=0.715, over 12329.00 frames. ], tot_loss[loss=2.984, ArTop10Accuracy=0.7152, over 10907.27 frames. ], batch size: 22, lr: 1.61e-02
2024-08-06 04:36:48,811 INFO [trainer.py:765] (5/8) Epoch 7, batch 600, train_loss[loss=2.926, ArTop10Accuracy=0.7292, over 11628.00 frames. ], tot_loss[loss=2.988, ArTop10Accuracy=0.714, over 11413.22 frames. ], batch size: 18, lr: 1.61e-02
2024-08-06 04:37:34,800 INFO [trainer.py:765] (5/8) Epoch 7, batch 700, train_loss[loss=3.087, ArTop10Accuracy=0.6988, over 10024.00 frames. ], tot_loss[loss=2.994, ArTop10Accuracy=0.7128, over 11560.62 frames. ], batch size: 12, lr: 1.60e-02
2024-08-06 04:38:13,613 INFO [trainer.py:765] (5/8) Epoch 7, batch 800, train_loss[loss=3.024, ArTop10Accuracy=0.7085, over 10156.00 frames. ], tot_loss[loss=3.002, ArTop10Accuracy=0.7113, over 11686.56 frames. ], batch size: 12, lr: 1.59e-02
2024-08-06 04:38:45,110 INFO [trainer.py:765] (5/8) Epoch 7, batch 900, train_loss[loss=3.019, ArTop10Accuracy=0.7157, over 12818.00 frames. ], tot_loss[loss=2.995, ArTop10Accuracy=0.7129, over 11753.12 frames. ], batch size: 27, lr: 1.59e-02
2024-08-06 04:39:16,575 INFO [trainer.py:765] (5/8) Epoch 7, batch 1000, train_loss[loss=3.124, ArTop10Accuracy=0.6866, over 12760.00 frames. ], tot_loss[loss=2.997, ArTop10Accuracy=0.7125, over 11950.82 frames. ], batch size: 27, lr: 1.58e-02
2024-08-06 04:39:47,571 INFO [trainer.py:765] (5/8) Epoch 7, batch 1100, train_loss[loss=3.03, ArTop10Accuracy=0.705, over 13485.00 frames. ], tot_loss[loss=3.005, ArTop10Accuracy=0.7111, over 11978.12 frames. ], batch size: 34, lr: 1.57e-02
2024-08-06 04:40:17,990 INFO [trainer.py:765] (5/8) Epoch 7, batch 1200, train_loss[loss=3.173, ArTop10Accuracy=0.6773, over 12137.00 frames. ], tot_loss[loss=2.997, ArTop10Accuracy=0.7122, over 11932.50 frames. ], batch size: 97, lr: 1.57e-02
2024-08-06 04:40:43,269 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:41:37,492 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 9.816e+01 1.295e+02 1.411e+02 1.574e+02 4.953e+02, threshold=2.821e+02, percent-clipped=1.1
2024-08-06 04:41:58,371 INFO [trainer.py:765] (5/8) Epoch 8, batch 100, train_loss[loss=2.956, ArTop10Accuracy=0.7218, over 14565.00 frames. ], tot_loss[loss=2.963, ArTop10Accuracy=0.7205, over 4781.31 frames. ], batch size: 61, lr: 1.47e-02
2024-08-06 04:42:44,986 INFO [trainer.py:765] (5/8) Epoch 8, batch 200, train_loss[loss=2.915, ArTop10Accuracy=0.7326, over 13588.00 frames. ], tot_loss[loss=2.95, ArTop10Accuracy=0.7227, over 7779.99 frames. ], batch size: 34, lr: 1.46e-02
2024-08-06 04:43:28,045 INFO [trainer.py:765] (5/8) Epoch 8, batch 300, train_loss[loss=3.011, ArTop10Accuracy=0.7113, over 14521.00 frames. ], tot_loss[loss=2.947, ArTop10Accuracy=0.723, over 9412.24 frames. ], batch size: 44, lr: 1.46e-02
2024-08-06 04:44:14,461 INFO [trainer.py:765] (5/8) Epoch 8, batch 400, train_loss[loss=2.832, ArTop10Accuracy=0.7502, over 10926.00 frames. ], tot_loss[loss=2.947, ArTop10Accuracy=0.7225, over 10339.95 frames. ], batch size: 15, lr: 1.45e-02
2024-08-06 04:45:00,691 INFO [trainer.py:765] (5/8) Epoch 8, batch 500, train_loss[loss=2.895, ArTop10Accuracy=0.7333, over 12150.00 frames. ], tot_loss[loss=2.951, ArTop10Accuracy=0.7217, over 10895.71 frames. ], batch size: 22, lr: 1.45e-02
2024-08-06 04:45:45,393 INFO [trainer.py:765] (5/8) Epoch 8, batch 600, train_loss[loss=3.039, ArTop10Accuracy=0.712, over 11707.00 frames. ], tot_loss[loss=2.961, ArTop10Accuracy=0.7197, over 11435.04 frames. ], batch size: 18, lr: 1.44e-02
2024-08-06 04:46:34,037 INFO [trainer.py:765] (5/8) Epoch 8, batch 700, train_loss[loss=2.897, ArTop10Accuracy=0.7229, over 10093.00 frames. ], tot_loss[loss=2.967, ArTop10Accuracy=0.7184, over 11595.78 frames. ], batch size: 12, lr: 1.43e-02
2024-08-06 04:47:10,208 INFO [trainer.py:765] (5/8) Epoch 8, batch 800, train_loss[loss=2.983, ArTop10Accuracy=0.7092, over 10196.00 frames. ], tot_loss[loss=2.971, ArTop10Accuracy=0.7176, over 11709.49 frames. ], batch size: 12, lr: 1.43e-02
2024-08-06 04:47:41,606 INFO [trainer.py:765] (5/8) Epoch 8, batch 900, train_loss[loss=2.917, ArTop10Accuracy=0.7312, over 12961.00 frames. ], tot_loss[loss=2.964, ArTop10Accuracy=0.7189, over 11748.63 frames. ], batch size: 27, lr: 1.42e-02
2024-08-06 04:48:13,033 INFO [trainer.py:765] (5/8) Epoch 8, batch 1000, train_loss[loss=2.901, ArTop10Accuracy=0.7342, over 12912.00 frames. ], tot_loss[loss=2.967, ArTop10Accuracy=0.7186, over 11953.22 frames. ], batch size: 27, lr: 1.42e-02
2024-08-06 04:48:28,828 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 04:48:37,663 INFO [trainer.py:811] (5/8) Epoch 8, validation: loss=2.946, ArTop10Accuracy=0.7266, over 1829298.00 frames.
2024-08-06 04:48:37,664 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 04:48:37,951 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.035e+02 1.289e+02 1.393e+02 1.532e+02 3.557e+02, threshold=2.786e+02, percent-clipped=0.2
2024-08-06 04:48:52,932 INFO [trainer.py:765] (5/8) Epoch 8, batch 1100, train_loss[loss=2.94, ArTop10Accuracy=0.7232, over 13753.00 frames. ], tot_loss[loss=2.968, ArTop10Accuracy=0.718, over 12000.96 frames. ], batch size: 34, lr: 1.41e-02
2024-08-06 04:49:23,202 INFO [trainer.py:765] (5/8) Epoch 8, batch 1200, train_loss[loss=3.118, ArTop10Accuracy=0.6896, over 12209.00 frames. ], tot_loss[loss=2.968, ArTop10Accuracy=0.7178, over 11948.11 frames. ], batch size: 97, lr: 1.40e-02
2024-08-06 04:49:48,652 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:51:01,548 INFO [trainer.py:765] (5/8) Epoch 9, batch 100, train_loss[loss=3.038, ArTop10Accuracy=0.7066, over 14432.00 frames. ], tot_loss[loss=2.946, ArTop10Accuracy=0.7233, over 4777.24 frames. ], batch size: 61, lr: 1.32e-02
2024-08-06 04:51:45,415 INFO [trainer.py:765] (5/8) Epoch 9, batch 200, train_loss[loss=3.004, ArTop10Accuracy=0.7104, over 13554.00 frames. ], tot_loss[loss=2.939, ArTop10Accuracy=0.7247, over 7781.26 frames. ], batch size: 34, lr: 1.32e-02
2024-08-06 04:52:29,082 INFO [trainer.py:765] (5/8) Epoch 9, batch 300, train_loss[loss=2.969, ArTop10Accuracy=0.7192, over 13959.00 frames. ], tot_loss[loss=2.937, ArTop10Accuracy=0.7254, over 9430.97 frames. ], batch size: 44, lr: 1.31e-02
2024-08-06 04:53:16,431 INFO [trainer.py:765] (5/8) Epoch 9, batch 400, train_loss[loss=2.894, ArTop10Accuracy=0.7299, over 10772.00 frames. ], tot_loss[loss=2.937, ArTop10Accuracy=0.7249, over 10347.47 frames. ], batch size: 15, lr: 1.31e-02
2024-08-06 04:53:58,143 INFO [trainer.py:765] (5/8) Epoch 9, batch 500, train_loss[loss=3.015, ArTop10Accuracy=0.7047, over 12271.00 frames. ], tot_loss[loss=2.937, ArTop10Accuracy=0.7244, over 10911.02 frames. ], batch size: 22, lr: 1.30e-02
2024-08-06 04:54:51,077 INFO [trainer.py:765] (5/8) Epoch 9, batch 600, train_loss[loss=2.794, ArTop10Accuracy=0.7403, over 11511.00 frames. ], tot_loss[loss=2.94, ArTop10Accuracy=0.7237, over 11427.19 frames. ], batch size: 18, lr: 1.30e-02
2024-08-06 04:55:34,399 INFO [trainer.py:765] (5/8) Epoch 9, batch 700, train_loss[loss=2.959, ArTop10Accuracy=0.7114, over 10314.00 frames. ], tot_loss[loss=2.948, ArTop10Accuracy=0.7222, over 11569.37 frames. ], batch size: 12, lr: 1.29e-02
2024-08-06 04:56:04,575 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.029e+02 1.257e+02 1.367e+02 1.507e+02 8.820e+02, threshold=2.735e+02, percent-clipped=0.5
2024-08-06 04:56:13,598 INFO [trainer.py:765] (5/8) Epoch 9, batch 800, train_loss[loss=3.015, ArTop10Accuracy=0.7181, over 10057.00 frames. ], tot_loss[loss=2.95, ArTop10Accuracy=0.7216, over 11687.52 frames. ], batch size: 12, lr: 1.29e-02
2024-08-06 04:56:44,975 INFO [trainer.py:765] (5/8) Epoch 9, batch 900, train_loss[loss=3.023, ArTop10Accuracy=0.7055, over 12909.00 frames. ], tot_loss[loss=2.946, ArTop10Accuracy=0.7225, over 11724.74 frames. ], batch size: 27, lr: 1.28e-02
2024-08-06 04:57:16,492 INFO [trainer.py:765] (5/8) Epoch 9, batch 1000, train_loss[loss=3.026, ArTop10Accuracy=0.7122, over 13092.00 frames. ], tot_loss[loss=2.949, ArTop10Accuracy=0.722, over 11937.06 frames. ], batch size: 27, lr: 1.28e-02
2024-08-06 04:57:47,656 INFO [trainer.py:765] (5/8) Epoch 9, batch 1100, train_loss[loss=2.969, ArTop10Accuracy=0.7216, over 13605.00 frames. ], tot_loss[loss=2.956, ArTop10Accuracy=0.7207, over 11991.87 frames. ], batch size: 34, lr: 1.27e-02
2024-08-06 04:58:18,093 INFO [trainer.py:765] (5/8) Epoch 9, batch 1200, train_loss[loss=3.079, ArTop10Accuracy=0.6953, over 11790.00 frames. ], tot_loss[loss=2.962, ArTop10Accuracy=0.7196, over 11919.85 frames. ], batch size: 98, lr: 1.27e-02
2024-08-06 04:58:43,444 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 04:59:52,749 INFO [trainer.py:765] (5/8) Epoch 10, batch 100, train_loss[loss=3.014, ArTop10Accuracy=0.713, over 14680.00 frames. ], tot_loss[loss=2.931, ArTop10Accuracy=0.727, over 4772.51 frames. ], batch size: 61, lr: 1.20e-02
2024-08-06 05:00:43,730 INFO [trainer.py:765] (5/8) Epoch 10, batch 200, train_loss[loss=2.92, ArTop10Accuracy=0.7368, over 13794.00 frames. ], tot_loss[loss=2.919, ArTop10Accuracy=0.7287, over 7789.41 frames. ], batch size: 34, lr: 1.20e-02
2024-08-06 05:01:20,591 INFO [trainer.py:765] (5/8) Epoch 10, batch 300, train_loss[loss=3.037, ArTop10Accuracy=0.7069, over 14572.00 frames. ], tot_loss[loss=2.915, ArTop10Accuracy=0.7292, over 9410.37 frames. ], batch size: 44, lr: 1.19e-02
2024-08-06 05:02:10,048 INFO [trainer.py:765] (5/8) Epoch 10, batch 400, train_loss[loss=2.877, ArTop10Accuracy=0.7329, over 10954.00 frames. ], tot_loss[loss=2.914, ArTop10Accuracy=0.7293, over 10344.37 frames. ], batch size: 15, lr: 1.19e-02
2024-08-06 05:02:46,488 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 05:02:55,377 INFO [trainer.py:811] (5/8) Epoch 10, validation: loss=2.927, ArTop10Accuracy=0.7304, over 1829298.00 frames.
2024-08-06 05:02:55,378 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 05:02:55,729 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.023e+02 1.269e+02 1.367e+02 1.518e+02 4.405e+02, threshold=2.733e+02, percent-clipped=0.4
2024-08-06 05:02:58,361 INFO [trainer.py:765] (5/8) Epoch 10, batch 500, train_loss[loss=2.897, ArTop10Accuracy=0.7318, over 12237.00 frames. ], tot_loss[loss=2.912, ArTop10Accuracy=0.7293, over 10910.35 frames. ], batch size: 22, lr: 1.19e-02
2024-08-06 05:03:48,229 INFO [trainer.py:765] (5/8) Epoch 10, batch 600, train_loss[loss=2.841, ArTop10Accuracy=0.7436, over 11691.00 frames. ], tot_loss[loss=2.921, ArTop10Accuracy=0.7277, over 11434.83 frames. ], batch size: 18, lr: 1.18e-02
2024-08-06 05:04:36,716 INFO [trainer.py:765] (5/8) Epoch 10, batch 700, train_loss[loss=2.945, ArTop10Accuracy=0.727, over 10092.00 frames. ], tot_loss[loss=2.927, ArTop10Accuracy=0.7264, over 11581.35 frames. ], batch size: 12, lr: 1.18e-02
2024-08-06 05:05:10,725 INFO [trainer.py:765] (5/8) Epoch 10, batch 800, train_loss[loss=2.908, ArTop10Accuracy=0.7233, over 10079.00 frames. ], tot_loss[loss=2.932, ArTop10Accuracy=0.7252, over 11689.31 frames. ], batch size: 12, lr: 1.17e-02
2024-08-06 05:05:42,245 INFO [trainer.py:765] (5/8) Epoch 10, batch 900, train_loss[loss=2.883, ArTop10Accuracy=0.7376, over 13043.00 frames. ], tot_loss[loss=2.926, ArTop10Accuracy=0.7264, over 11738.86 frames. ], batch size: 27, lr: 1.17e-02
2024-08-06 05:06:13,844 INFO [trainer.py:765] (5/8) Epoch 10, batch 1000, train_loss[loss=2.926, ArTop10Accuracy=0.7263, over 12952.00 frames. ], tot_loss[loss=2.937, ArTop10Accuracy=0.7246, over 11940.80 frames. ], batch size: 27, lr: 1.16e-02
2024-08-06 05:06:45,055 INFO [trainer.py:765] (5/8) Epoch 10, batch 1100, train_loss[loss=3.042, ArTop10Accuracy=0.7047, over 13760.00 frames. ], tot_loss[loss=2.938, ArTop10Accuracy=0.7243, over 12002.23 frames. ], batch size: 34, lr: 1.16e-02
2024-08-06 05:07:15,483 INFO [trainer.py:765] (5/8) Epoch 10, batch 1200, train_loss[loss=3.121, ArTop10Accuracy=0.6849, over 12120.00 frames. ], tot_loss[loss=2.94, ArTop10Accuracy=0.7243, over 11958.72 frames. ], batch size: 97, lr: 1.16e-02
2024-08-06 05:07:40,694 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:08:52,967 INFO [trainer.py:765] (5/8) Epoch 11, batch 100, train_loss[loss=2.937, ArTop10Accuracy=0.7192, over 14567.00 frames. ], tot_loss[loss=2.905, ArTop10Accuracy=0.7318, over 4793.45 frames. ], batch size: 61, lr: 1.10e-02
2024-08-06 05:09:41,277 INFO [trainer.py:765] (5/8) Epoch 11, batch 200, train_loss[loss=2.898, ArTop10Accuracy=0.7312, over 14043.00 frames. ], tot_loss[loss=2.9, ArTop10Accuracy=0.7327, over 7807.45 frames. ], batch size: 34, lr: 1.10e-02
2024-08-06 05:09:51,176 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.001e+02 1.278e+02 1.371e+02 1.502e+02 3.785e+02, threshold=2.743e+02, percent-clipped=0.3
2024-08-06 05:10:24,721 INFO [trainer.py:765] (5/8) Epoch 11, batch 300, train_loss[loss=2.97, ArTop10Accuracy=0.7165, over 14339.00 frames. ], tot_loss[loss=2.898, ArTop10Accuracy=0.733, over 9424.25 frames. ], batch size: 44, lr: 1.09e-02
2024-08-06 05:11:11,784 INFO [trainer.py:765] (5/8) Epoch 11, batch 400, train_loss[loss=2.766, ArTop10Accuracy=0.7584, over 10857.00 frames. ], tot_loss[loss=2.902, ArTop10Accuracy=0.7318, over 10351.17 frames. ], batch size: 15, lr: 1.09e-02
2024-08-06 05:11:52,692 INFO [trainer.py:765] (5/8) Epoch 11, batch 500, train_loss[loss=2.901, ArTop10Accuracy=0.7325, over 12308.00 frames. ], tot_loss[loss=2.896, ArTop10Accuracy=0.7328, over 10934.98 frames. ], batch size: 22, lr: 1.09e-02
2024-08-06 05:12:40,287 INFO [trainer.py:765] (5/8) Epoch 11, batch 600, train_loss[loss=2.842, ArTop10Accuracy=0.75, over 11632.00 frames. ], tot_loss[loss=2.9, ArTop10Accuracy=0.7319, over 11450.89 frames. ], batch size: 18, lr: 1.08e-02
2024-08-06 05:13:25,708 INFO [trainer.py:765] (5/8) Epoch 11, batch 700, train_loss[loss=3.059, ArTop10Accuracy=0.7049, over 10090.00 frames. ], tot_loss[loss=2.907, ArTop10Accuracy=0.7304, over 11613.90 frames. ], batch size: 12, lr: 1.08e-02
2024-08-06 05:14:04,206 INFO [trainer.py:765] (5/8) Epoch 11, batch 800, train_loss[loss=2.711, ArTop10Accuracy=0.7611, over 10006.00 frames. ], tot_loss[loss=2.916, ArTop10Accuracy=0.7287, over 11724.83 frames. ], batch size: 12, lr: 1.07e-02
2024-08-06 05:14:35,667 INFO [trainer.py:765] (5/8) Epoch 11, batch 900, train_loss[loss=2.932, ArTop10Accuracy=0.7206, over 13172.00 frames. ], tot_loss[loss=2.911, ArTop10Accuracy=0.7297, over 11748.89 frames. ], batch size: 27, lr: 1.07e-02
2024-08-06 05:15:07,263 INFO [trainer.py:765] (5/8) Epoch 11, batch 1000, train_loss[loss=2.941, ArTop10Accuracy=0.7212, over 12987.00 frames. ], tot_loss[loss=2.918, ArTop10Accuracy=0.728, over 11920.40 frames. ], batch size: 27, lr: 1.07e-02
2024-08-06 05:15:38,261 INFO [trainer.py:765] (5/8) Epoch 11, batch 1100, train_loss[loss=2.858, ArTop10Accuracy=0.7273, over 13739.00 frames. ], tot_loss[loss=2.923, ArTop10Accuracy=0.7271, over 11993.56 frames. ], batch size: 34, lr: 1.06e-02
2024-08-06 05:16:08,498 INFO [trainer.py:765] (5/8) Epoch 11, batch 1200, train_loss[loss=3.114, ArTop10Accuracy=0.6951, over 11855.00 frames. ], tot_loss[loss=2.922, ArTop10Accuracy=0.7272, over 11930.92 frames. ], batch size: 97, lr: 1.06e-02
2024-08-06 05:16:12,698 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 05:16:21,622 INFO [trainer.py:811] (5/8) Epoch 11, validation: loss=2.923, ArTop10Accuracy=0.7318, over 1829298.00 frames.
2024-08-06 05:16:21,623 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 05:16:21,949 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.076e+02 1.268e+02 1.368e+02 1.481e+02 4.790e+02, threshold=2.736e+02, percent-clipped=0.6
2024-08-06 05:16:42,523 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:18:03,006 INFO [trainer.py:765] (5/8) Epoch 12, batch 100, train_loss[loss=2.966, ArTop10Accuracy=0.7207, over 14796.00 frames. ], tot_loss[loss=2.892, ArTop10Accuracy=0.7338, over 4788.81 frames. ], batch size: 61, lr: 1.01e-02
2024-08-06 05:18:46,005 INFO [trainer.py:765] (5/8) Epoch 12, batch 200, train_loss[loss=2.89, ArTop10Accuracy=0.7343, over 13601.00 frames. ], tot_loss[loss=2.887, ArTop10Accuracy=0.7347, over 7816.76 frames. ], batch size: 34, lr: 1.01e-02
2024-08-06 05:19:31,946 INFO [trainer.py:765] (5/8) Epoch 12, batch 300, train_loss[loss=2.991, ArTop10Accuracy=0.7133, over 14200.00 frames. ], tot_loss[loss=2.887, ArTop10Accuracy=0.7348, over 9430.75 frames. ], batch size: 44, lr: 1.01e-02
2024-08-06 05:20:12,431 INFO [trainer.py:765] (5/8) Epoch 12, batch 400, train_loss[loss=2.786, ArTop10Accuracy=0.7553, over 10949.00 frames. ], tot_loss[loss=2.887, ArTop10Accuracy=0.7348, over 10368.71 frames. ], batch size: 15, lr: 1.00e-02
2024-08-06 05:21:00,640 INFO [trainer.py:765] (5/8) Epoch 12, batch 500, train_loss[loss=2.895, ArTop10Accuracy=0.7366, over 12408.00 frames. ], tot_loss[loss=2.886, ArTop10Accuracy=0.7348, over 10915.69 frames. ], batch size: 22, lr: 9.99e-03
2024-08-06 05:21:43,916 INFO [trainer.py:765] (5/8) Epoch 12, batch 600, train_loss[loss=2.869, ArTop10Accuracy=0.7357, over 11756.00 frames. ], tot_loss[loss=2.889, ArTop10Accuracy=0.7342, over 11437.24 frames. ], batch size: 18, lr: 9.96e-03
2024-08-06 05:22:32,206 INFO [trainer.py:765] (5/8) Epoch 12, batch 700, train_loss[loss=2.898, ArTop10Accuracy=0.7291, over 9976.00 frames. ], tot_loss[loss=2.896, ArTop10Accuracy=0.7326, over 11586.51 frames. ], batch size: 12, lr: 9.93e-03
2024-08-06 05:23:08,911 INFO [trainer.py:765] (5/8) Epoch 12, batch 800, train_loss[loss=2.991, ArTop10Accuracy=0.7164, over 9276.00 frames. ], tot_loss[loss=2.903, ArTop10Accuracy=0.7314, over 11706.98 frames. ], batch size: 11, lr: 9.90e-03
2024-08-06 05:23:40,460 INFO [trainer.py:765] (5/8) Epoch 12, batch 900, train_loss[loss=2.842, ArTop10Accuracy=0.75, over 12874.00 frames. ], tot_loss[loss=2.897, ArTop10Accuracy=0.7326, over 11760.21 frames. ], batch size: 27, lr: 9.87e-03
2024-08-06 05:23:54,577 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.067e+02 1.273e+02 1.376e+02 1.503e+02 4.050e+02, threshold=2.752e+02, percent-clipped=0.4
2024-08-06 05:24:14,346 INFO [trainer.py:765] (5/8) Epoch 12, batch 1000, train_loss[loss=2.907, ArTop10Accuracy=0.7301, over 13063.00 frames. ], tot_loss[loss=2.897, ArTop10Accuracy=0.7323, over 11944.31 frames. ], batch size: 27, lr: 9.84e-03
2024-08-06 05:24:45,501 INFO [trainer.py:765] (5/8) Epoch 12, batch 1100, train_loss[loss=2.942, ArTop10Accuracy=0.7275, over 14028.00 frames. ], tot_loss[loss=2.908, ArTop10Accuracy=0.7303, over 11991.42 frames. ], batch size: 34, lr: 9.81e-03
2024-08-06 05:25:15,882 INFO [trainer.py:765] (5/8) Epoch 12, batch 1200, train_loss[loss=3.043, ArTop10Accuracy=0.7045, over 11848.00 frames. ], tot_loss[loss=2.909, ArTop10Accuracy=0.7302, over 11935.06 frames. ], batch size: 97, lr: 9.78e-03
2024-08-06 05:25:41,190 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:26:46,787 INFO [trainer.py:765] (5/8) Epoch 13, batch 100, train_loss[loss=2.986, ArTop10Accuracy=0.713, over 14626.00 frames. ], tot_loss[loss=2.874, ArTop10Accuracy=0.7377, over 4775.81 frames. ], batch size: 61, lr: 9.36e-03
2024-08-06 05:27:32,553 INFO [trainer.py:765] (5/8) Epoch 13, batch 200, train_loss[loss=2.878, ArTop10Accuracy=0.7345, over 13751.00 frames. ], tot_loss[loss=2.873, ArTop10Accuracy=0.7379, over 7798.49 frames. ], batch size: 34, lr: 9.34e-03
2024-08-06 05:28:16,036 INFO [trainer.py:765] (5/8) Epoch 13, batch 300, train_loss[loss=2.99, ArTop10Accuracy=0.7187, over 14290.00 frames. ], tot_loss[loss=2.863, ArTop10Accuracy=0.7397, over 9411.87 frames. ], batch size: 44, lr: 9.31e-03
2024-08-06 05:29:00,149 INFO [trainer.py:765] (5/8) Epoch 13, batch 400, train_loss[loss=2.804, ArTop10Accuracy=0.7396, over 11075.00 frames. ], tot_loss[loss=2.872, ArTop10Accuracy=0.7381, over 10347.05 frames. ], batch size: 15, lr: 9.28e-03
2024-08-06 05:29:43,967 INFO [trainer.py:765] (5/8) Epoch 13, batch 500, train_loss[loss=2.766, ArTop10Accuracy=0.7527, over 12172.00 frames. ], tot_loss[loss=2.872, ArTop10Accuracy=0.7376, over 10906.91 frames. ], batch size: 22, lr: 9.26e-03
2024-08-06 05:30:24,247 INFO [trainer.py:765] (5/8) Epoch 13, batch 600, train_loss[loss=2.868, ArTop10Accuracy=0.7391, over 11584.00 frames. ], tot_loss[loss=2.873, ArTop10Accuracy=0.7374, over 11413.24 frames. ], batch size: 18, lr: 9.23e-03
2024-08-06 05:30:58,110 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 05:31:07,054 INFO [trainer.py:811] (5/8) Epoch 13, validation: loss=2.918, ArTop10Accuracy=0.733, over 1829298.00 frames.
2024-08-06 05:31:07,055 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 05:31:07,351 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.049e+02 1.283e+02 1.389e+02 1.496e+02 2.729e+02, threshold=2.779e+02, percent-clipped=0.0
2024-08-06 05:31:24,043 INFO [trainer.py:765] (5/8) Epoch 13, batch 700, train_loss[loss=2.845, ArTop10Accuracy=0.7507, over 10064.00 frames. ], tot_loss[loss=2.884, ArTop10Accuracy=0.7352, over 11570.16 frames. ], batch size: 12, lr: 9.20e-03
2024-08-06 05:32:00,147 INFO [trainer.py:765] (5/8) Epoch 13, batch 800, train_loss[loss=2.783, ArTop10Accuracy=0.745, over 9973.00 frames. ], tot_loss[loss=2.887, ArTop10Accuracy=0.7345, over 11676.01 frames. ], batch size: 12, lr: 9.18e-03
2024-08-06 05:32:31,522 INFO [trainer.py:765] (5/8) Epoch 13, batch 900, train_loss[loss=2.924, ArTop10Accuracy=0.7292, over 12966.00 frames. ], tot_loss[loss=2.885, ArTop10Accuracy=0.7349, over 11722.13 frames. ], batch size: 27, lr: 9.15e-03
2024-08-06 05:33:03,043 INFO [trainer.py:765] (5/8) Epoch 13, batch 1000, train_loss[loss=2.894, ArTop10Accuracy=0.7336, over 12922.00 frames. ], tot_loss[loss=2.889, ArTop10Accuracy=0.7343, over 11908.18 frames. ], batch size: 27, lr: 9.13e-03
2024-08-06 05:33:34,232 INFO [trainer.py:765] (5/8) Epoch 13, batch 1100, train_loss[loss=2.901, ArTop10Accuracy=0.7322, over 13749.00 frames. ], tot_loss[loss=2.898, ArTop10Accuracy=0.7324, over 11996.25 frames. ], batch size: 34, lr: 9.10e-03
2024-08-06 05:34:04,519 INFO [trainer.py:765] (5/8) Epoch 13, batch 1200, train_loss[loss=2.983, ArTop10Accuracy=0.7158, over 11719.00 frames. ], tot_loss[loss=2.899, ArTop10Accuracy=0.7324, over 11936.10 frames. ], batch size: 99, lr: 9.07e-03
2024-08-06 05:34:29,787 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:35:39,198 INFO [trainer.py:765] (5/8) Epoch 14, batch 100, train_loss[loss=2.966, ArTop10Accuracy=0.7234, over 14614.00 frames. ], tot_loss[loss=2.866, ArTop10Accuracy=0.7396, over 4788.57 frames. ], batch size: 61, lr: 8.71e-03
2024-08-06 05:36:23,063 INFO [trainer.py:765] (5/8) Epoch 14, batch 200, train_loss[loss=2.882, ArTop10Accuracy=0.7387, over 13887.00 frames. ], tot_loss[loss=2.86, ArTop10Accuracy=0.7408, over 7776.78 frames. ], batch size: 34, lr: 8.68e-03
2024-08-06 05:37:09,311 INFO [trainer.py:765] (5/8) Epoch 14, batch 300, train_loss[loss=2.932, ArTop10Accuracy=0.7357, over 14494.00 frames. ], tot_loss[loss=2.854, ArTop10Accuracy=0.7416, over 9433.89 frames. ], batch size: 44, lr: 8.66e-03
2024-08-06 05:37:46,030 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.097e+02 1.304e+02 1.410e+02 1.531e+02 2.912e+02, threshold=2.820e+02, percent-clipped=0.2
2024-08-06 05:37:55,139 INFO [trainer.py:765] (5/8) Epoch 14, batch 400, train_loss[loss=2.878, ArTop10Accuracy=0.7394, over 11072.00 frames. ], tot_loss[loss=2.86, ArTop10Accuracy=0.7407, over 10346.32 frames. ], batch size: 15, lr: 8.64e-03
2024-08-06 05:38:42,025 INFO [trainer.py:765] (5/8) Epoch 14, batch 500, train_loss[loss=2.941, ArTop10Accuracy=0.7272, over 12282.00 frames. ], tot_loss[loss=2.863, ArTop10Accuracy=0.7398, over 10900.23 frames. ], batch size: 22, lr: 8.61e-03
2024-08-06 05:39:22,374 INFO [trainer.py:765] (5/8) Epoch 14, batch 600, train_loss[loss=2.826, ArTop10Accuracy=0.7509, over 11567.00 frames. ], tot_loss[loss=2.863, ArTop10Accuracy=0.7392, over 11428.54 frames. ], batch size: 18, lr: 8.59e-03
2024-08-06 05:40:15,144 INFO [trainer.py:765] (5/8) Epoch 14, batch 700, train_loss[loss=2.749, ArTop10Accuracy=0.768, over 10135.00 frames. ], tot_loss[loss=2.872, ArTop10Accuracy=0.7375, over 11572.30 frames. ], batch size: 12, lr: 8.57e-03
2024-08-06 05:40:49,136 INFO [trainer.py:765] (5/8) Epoch 14, batch 800, train_loss[loss=2.793, ArTop10Accuracy=0.7499, over 10077.00 frames. ], tot_loss[loss=2.874, ArTop10Accuracy=0.7368, over 11706.79 frames. ], batch size: 12, lr: 8.55e-03
2024-08-06 05:41:20,467 INFO [trainer.py:765] (5/8) Epoch 14, batch 900, train_loss[loss=2.872, ArTop10Accuracy=0.7384, over 13285.00 frames. ], tot_loss[loss=2.862, ArTop10Accuracy=0.739, over 11752.36 frames. ], batch size: 28, lr: 8.52e-03
2024-08-06 05:41:51,996 INFO [trainer.py:765] (5/8) Epoch 14, batch 1000, train_loss[loss=2.878, ArTop10Accuracy=0.7373, over 13032.00 frames. ], tot_loss[loss=2.865, ArTop10Accuracy=0.7385, over 11945.61 frames. ], batch size: 27, lr: 8.50e-03
2024-08-06 05:42:23,217 INFO [trainer.py:765] (5/8) Epoch 14, batch 1100, train_loss[loss=2.913, ArTop10Accuracy=0.7271, over 13720.00 frames. ], tot_loss[loss=2.877, ArTop10Accuracy=0.7362, over 11998.01 frames. ], batch size: 34, lr: 8.48e-03
2024-08-06 05:42:53,549 INFO [trainer.py:765] (5/8) Epoch 14, batch 1200, train_loss[loss=3.009, ArTop10Accuracy=0.7162, over 12218.00 frames. ], tot_loss[loss=2.881, ArTop10Accuracy=0.7356, over 11940.21 frames. ], batch size: 99, lr: 8.46e-03
2024-08-06 05:43:18,804 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:44:28,571 INFO [trainer.py:765] (5/8) Epoch 15, batch 100, train_loss[loss=2.916, ArTop10Accuracy=0.7277, over 14505.00 frames. ], tot_loss[loss=2.847, ArTop10Accuracy=0.7427, over 4774.84 frames. ], batch size: 62, lr: 8.14e-03
2024-08-06 05:44:29,213 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 05:44:38,023 INFO [trainer.py:811] (5/8) Epoch 15, validation: loss=2.913, ArTop10Accuracy=0.7339, over 1829298.00 frames.
2024-08-06 05:44:38,024 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 05:44:38,413 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.100e+02 1.307e+02 1.417e+02 1.528e+02 2.981e+02, threshold=2.833e+02, percent-clipped=0.1
2024-08-06 05:45:20,185 INFO [trainer.py:765] (5/8) Epoch 15, batch 200, train_loss[loss=2.899, ArTop10Accuracy=0.7329, over 13933.00 frames. ], tot_loss[loss=2.846, ArTop10Accuracy=0.7431, over 7781.58 frames. ], batch size: 34, lr: 8.11e-03
2024-08-06 05:46:04,647 INFO [trainer.py:765] (5/8) Epoch 15, batch 300, train_loss[loss=2.887, ArTop10Accuracy=0.7367, over 14309.00 frames. ], tot_loss[loss=2.844, ArTop10Accuracy=0.7435, over 9416.21 frames. ], batch size: 44, lr: 8.09e-03
2024-08-06 05:46:51,902 INFO [trainer.py:765] (5/8) Epoch 15, batch 400, train_loss[loss=2.768, ArTop10Accuracy=0.7495, over 10263.00 frames. ], tot_loss[loss=2.847, ArTop10Accuracy=0.7428, over 10321.60 frames. ], batch size: 14, lr: 8.07e-03
2024-08-06 05:47:36,911 INFO [trainer.py:765] (5/8) Epoch 15, batch 500, train_loss[loss=2.699, ArTop10Accuracy=0.77, over 12305.00 frames. ], tot_loss[loss=2.844, ArTop10Accuracy=0.743, over 10893.49 frames. ], batch size: 22, lr: 8.05e-03
2024-08-06 05:48:24,723 INFO [trainer.py:765] (5/8) Epoch 15, batch 600, train_loss[loss=2.845, ArTop10Accuracy=0.7478, over 11519.00 frames. ], tot_loss[loss=2.846, ArTop10Accuracy=0.7424, over 11419.70 frames. ], batch size: 18, lr: 8.03e-03
2024-08-06 05:49:11,854 INFO [trainer.py:765] (5/8) Epoch 15, batch 700, train_loss[loss=2.881, ArTop10Accuracy=0.7378, over 10212.00 frames. ], tot_loss[loss=2.853, ArTop10Accuracy=0.7416, over 11566.88 frames. ], batch size: 12, lr: 8.01e-03
2024-08-06 05:49:45,779 INFO [trainer.py:765] (5/8) Epoch 15, batch 800, train_loss[loss=2.616, ArTop10Accuracy=0.7703, over 9258.00 frames. ], tot_loss[loss=2.864, ArTop10Accuracy=0.7395, over 11686.86 frames. ], batch size: 11, lr: 7.99e-03
2024-08-06 05:50:17,210 INFO [trainer.py:765] (5/8) Epoch 15, batch 900, train_loss[loss=2.815, ArTop10Accuracy=0.7488, over 12825.00 frames. ], tot_loss[loss=2.855, ArTop10Accuracy=0.7409, over 11735.78 frames. ], batch size: 27, lr: 7.97e-03
2024-08-06 05:50:48,829 INFO [trainer.py:765] (5/8) Epoch 15, batch 1000, train_loss[loss=2.881, ArTop10Accuracy=0.7417, over 12877.00 frames. ], tot_loss[loss=2.859, ArTop10Accuracy=0.7402, over 11943.33 frames. ], batch size: 27, lr: 7.95e-03
2024-08-06 05:51:20,069 INFO [trainer.py:765] (5/8) Epoch 15, batch 1100, train_loss[loss=2.868, ArTop10Accuracy=0.7424, over 13599.00 frames. ], tot_loss[loss=2.865, ArTop10Accuracy=0.7392, over 11998.68 frames. ], batch size: 34, lr: 7.93e-03
2024-08-06 05:51:23,515 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.123e+02 1.337e+02 1.431e+02 1.541e+02 2.784e+02, threshold=2.862e+02, percent-clipped=0.0
2024-08-06 05:51:53,082 INFO [trainer.py:765] (5/8) Epoch 15, batch 1200, train_loss[loss=3.011, ArTop10Accuracy=0.7153, over 11780.00 frames. ], tot_loss[loss=2.869, ArTop10Accuracy=0.7382, over 11953.49 frames. ], batch size: 99, lr: 7.91e-03
2024-08-06 05:52:18,030 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 05:53:29,263 INFO [trainer.py:765] (5/8) Epoch 16, batch 100, train_loss[loss=2.965, ArTop10Accuracy=0.7277, over 14761.00 frames. ], tot_loss[loss=2.831, ArTop10Accuracy=0.7467, over 4762.90 frames. ], batch size: 61, lr: 7.63e-03
2024-08-06 05:54:12,878 INFO [trainer.py:765] (5/8) Epoch 16, batch 200, train_loss[loss=2.873, ArTop10Accuracy=0.7397, over 13755.00 frames. ], tot_loss[loss=2.83, ArTop10Accuracy=0.7465, over 7785.70 frames. ], batch size: 34, lr: 7.61e-03
2024-08-06 05:54:59,737 INFO [trainer.py:765] (5/8) Epoch 16, batch 300, train_loss[loss=2.916, ArTop10Accuracy=0.7231, over 14126.00 frames. ], tot_loss[loss=2.827, ArTop10Accuracy=0.7468, over 9405.54 frames. ], batch size: 44, lr: 7.59e-03
2024-08-06 05:55:41,931 INFO [trainer.py:765] (5/8) Epoch 16, batch 400, train_loss[loss=2.712, ArTop10Accuracy=0.7616, over 10183.00 frames. ], tot_loss[loss=2.828, ArTop10Accuracy=0.7464, over 10323.19 frames. ], batch size: 14, lr: 7.58e-03
2024-08-06 05:56:27,680 INFO [trainer.py:765] (5/8) Epoch 16, batch 500, train_loss[loss=2.853, ArTop10Accuracy=0.7371, over 12258.00 frames. ], tot_loss[loss=2.833, ArTop10Accuracy=0.7453, over 10879.50 frames. ], batch size: 22, lr: 7.56e-03
2024-08-06 05:57:12,440 INFO [trainer.py:765] (5/8) Epoch 16, batch 600, train_loss[loss=2.766, ArTop10Accuracy=0.7673, over 11722.00 frames. ], tot_loss[loss=2.839, ArTop10Accuracy=0.7444, over 11435.82 frames. ], batch size: 18, lr: 7.54e-03
2024-08-06 05:58:00,041 INFO [trainer.py:765] (5/8) Epoch 16, batch 700, train_loss[loss=2.886, ArTop10Accuracy=0.7349, over 10114.00 frames. ], tot_loss[loss=2.852, ArTop10Accuracy=0.7415, over 11570.45 frames. ], batch size: 12, lr: 7.52e-03
2024-08-06 05:58:34,024 INFO [trainer.py:765] (5/8) Epoch 16, batch 800, train_loss[loss=2.717, ArTop10Accuracy=0.7578, over 9291.00 frames. ], tot_loss[loss=2.858, ArTop10Accuracy=0.7406, over 11686.05 frames. ], batch size: 11, lr: 7.50e-03
2024-08-06 05:58:41,569 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 05:58:50,426 INFO [trainer.py:811] (5/8) Epoch 16, validation: loss=2.915, ArTop10Accuracy=0.7338, over 1829298.00 frames.
2024-08-06 05:58:50,427 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 05:58:50,730 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.121e+02 1.335e+02 1.445e+02 1.570e+02 3.252e+02, threshold=2.890e+02, percent-clipped=0.1
2024-08-06 05:59:14,321 INFO [trainer.py:765] (5/8) Epoch 16, batch 900, train_loss[loss=2.851, ArTop10Accuracy=0.7429, over 12940.00 frames. ], tot_loss[loss=2.844, ArTop10Accuracy=0.7432, over 11758.97 frames. ], batch size: 27, lr: 7.49e-03
2024-08-06 05:59:45,915 INFO [trainer.py:765] (5/8) Epoch 16, batch 1000, train_loss[loss=2.886, ArTop10Accuracy=0.7365, over 12968.00 frames. ], tot_loss[loss=2.844, ArTop10Accuracy=0.7431, over 11937.70 frames. ], batch size: 27, lr: 7.47e-03
2024-08-06 06:00:17,092 INFO [trainer.py:765] (5/8) Epoch 16, batch 1100, train_loss[loss=2.966, ArTop10Accuracy=0.7182, over 13702.00 frames. ], tot_loss[loss=2.858, ArTop10Accuracy=0.7403, over 11986.84 frames. ], batch size: 34, lr: 7.45e-03
2024-08-06 06:00:47,464 INFO [trainer.py:765] (5/8) Epoch 16, batch 1200, train_loss[loss=3.017, ArTop10Accuracy=0.7132, over 12539.00 frames. ], tot_loss[loss=2.864, ArTop10Accuracy=0.7394, over 11949.06 frames. ], batch size: 97, lr: 7.43e-03
2024-08-06 06:01:13,070 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 06:02:27,261 INFO [trainer.py:765] (5/8) Epoch 17, batch 100, train_loss[loss=2.847, ArTop10Accuracy=0.7444, over 14808.00 frames. ], tot_loss[loss=2.825, ArTop10Accuracy=0.7476, over 4788.46 frames. ], batch size: 61, lr: 7.18e-03
2024-08-06 06:03:11,851 INFO [trainer.py:765] (5/8) Epoch 17, batch 200, train_loss[loss=2.861, ArTop10Accuracy=0.7373, over 13755.00 frames. ], tot_loss[loss=2.821, ArTop10Accuracy=0.7484, over 7809.53 frames. ], batch size: 34, lr: 7.17e-03
2024-08-06 06:03:57,502 INFO [trainer.py:765] (5/8) Epoch 17, batch 300, train_loss[loss=2.831, ArTop10Accuracy=0.7449, over 14115.00 frames. ], tot_loss[loss=2.821, ArTop10Accuracy=0.7484, over 9430.76 frames. ], batch size: 44, lr: 7.15e-03
2024-08-06 06:04:42,838 INFO [trainer.py:765] (5/8) Epoch 17, batch 400, train_loss[loss=2.813, ArTop10Accuracy=0.7461, over 10334.00 frames. ], tot_loss[loss=2.824, ArTop10Accuracy=0.7475, over 10334.62 frames. ], batch size: 14, lr: 7.13e-03
2024-08-06 06:05:29,004 INFO [trainer.py:765] (5/8) Epoch 17, batch 500, train_loss[loss=2.851, ArTop10Accuracy=0.739, over 12347.00 frames. ], tot_loss[loss=2.825, ArTop10Accuracy=0.7474, over 10907.53 frames. ], batch size: 22, lr: 7.12e-03
2024-08-06 06:05:49,551 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.142e+02 1.359e+02 1.445e+02 1.551e+02 2.741e+02, threshold=2.891e+02, percent-clipped=0.0
2024-08-06 06:06:20,723 INFO [trainer.py:765] (5/8) Epoch 17, batch 600, train_loss[loss=2.903, ArTop10Accuracy=0.7349, over 11538.00 frames. ], tot_loss[loss=2.831, ArTop10Accuracy=0.7459, over 11417.79 frames. ], batch size: 18, lr: 7.10e-03
2024-08-06 06:07:04,694 INFO [trainer.py:765] (5/8) Epoch 17, batch 700, train_loss[loss=2.761, ArTop10Accuracy=0.7582, over 10257.00 frames. ], tot_loss[loss=2.842, ArTop10Accuracy=0.7437, over 11580.44 frames. ], batch size: 12, lr: 7.09e-03
2024-08-06 06:07:44,896 INFO [trainer.py:765] (5/8) Epoch 17, batch 800, train_loss[loss=2.715, ArTop10Accuracy=0.7664, over 10337.00 frames. ], tot_loss[loss=2.84, ArTop10Accuracy=0.7439, over 11692.70 frames. ], batch size: 12, lr: 7.07e-03
2024-08-06 06:08:16,385 INFO [trainer.py:765] (5/8) Epoch 17, batch 900, train_loss[loss=2.8, ArTop10Accuracy=0.7541, over 12753.00 frames. ], tot_loss[loss=2.831, ArTop10Accuracy=0.7455, over 11736.58 frames. ], batch size: 27, lr: 7.05e-03
2024-08-06 06:08:47,995 INFO [trainer.py:765] (5/8) Epoch 17, batch 1000, train_loss[loss=2.847, ArTop10Accuracy=0.7361, over 12807.00 frames. ], tot_loss[loss=2.837, ArTop10Accuracy=0.7446, over 11931.11 frames. ], batch size: 27, lr: 7.04e-03
2024-08-06 06:09:19,134 INFO [trainer.py:765] (5/8) Epoch 17, batch 1100, train_loss[loss=2.782, ArTop10Accuracy=0.759, over 13725.00 frames. ], tot_loss[loss=2.845, ArTop10Accuracy=0.7431, over 11983.41 frames. ], batch size: 34, lr: 7.02e-03
2024-08-06 06:09:49,445 INFO [trainer.py:765] (5/8) Epoch 17, batch 1200, train_loss[loss=2.994, ArTop10Accuracy=0.7166, over 12228.00 frames. ], tot_loss[loss=2.847, ArTop10Accuracy=0.7427, over 11924.42 frames. ], batch size: 99, lr: 7.01e-03
2024-08-06 06:10:15,298 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 06:11:23,101 INFO [trainer.py:765] (5/8) Epoch 18, batch 100, train_loss[loss=2.892, ArTop10Accuracy=0.7381, over 14867.00 frames. ], tot_loss[loss=2.823, ArTop10Accuracy=0.7479, over 4783.92 frames. ], batch size: 61, lr: 6.78e-03
2024-08-06 06:12:16,260 INFO [trainer.py:765] (5/8) Epoch 18, batch 200, train_loss[loss=2.75, ArTop10Accuracy=0.7603, over 13290.00 frames. ], tot_loss[loss=2.82, ArTop10Accuracy=0.7488, over 7795.58 frames. ], batch size: 33, lr: 6.77e-03
2024-08-06 06:12:40,317 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 06:12:48,991 INFO [trainer.py:811] (5/8) Epoch 18, validation: loss=2.916, ArTop10Accuracy=0.7343, over 1829298.00 frames.
2024-08-06 06:12:48,992 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 06:12:49,335 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.163e+02 1.377e+02 1.476e+02 1.588e+02 2.450e+02, threshold=2.952e+02, percent-clipped=0.0
2024-08-06 06:13:07,116 INFO [trainer.py:765] (5/8) Epoch 18, batch 300, train_loss[loss=2.946, ArTop10Accuracy=0.7246, over 14437.00 frames. ], tot_loss[loss=2.812, ArTop10Accuracy=0.7502, over 9417.70 frames. ], batch size: 44, lr: 6.75e-03
2024-08-06 06:13:54,098 INFO [trainer.py:765] (5/8) Epoch 18, batch 400, train_loss[loss=2.769, ArTop10Accuracy=0.7524, over 10271.00 frames. ], tot_loss[loss=2.811, ArTop10Accuracy=0.7503, over 10343.54 frames. ], batch size: 14, lr: 6.74e-03
2024-08-06 06:14:38,488 INFO [trainer.py:765] (5/8) Epoch 18, batch 500, train_loss[loss=2.746, ArTop10Accuracy=0.7527, over 12279.00 frames. ], tot_loss[loss=2.806, ArTop10Accuracy=0.7508, over 10909.53 frames. ], batch size: 22, lr: 6.73e-03
2024-08-06 06:15:23,628 INFO [trainer.py:765] (5/8) Epoch 18, batch 600, train_loss[loss=2.855, ArTop10Accuracy=0.7384, over 11684.00 frames. ], tot_loss[loss=2.817, ArTop10Accuracy=0.7486, over 11420.40 frames. ], batch size: 18, lr: 6.71e-03
2024-08-06 06:16:17,343 INFO [trainer.py:765] (5/8) Epoch 18, batch 700, train_loss[loss=2.693, ArTop10Accuracy=0.7776, over 10180.00 frames. ], tot_loss[loss=2.823, ArTop10Accuracy=0.7475, over 11547.38 frames. ], batch size: 12, lr: 6.70e-03
2024-08-06 06:16:51,428 INFO [trainer.py:765] (5/8) Epoch 18, batch 800, train_loss[loss=2.714, ArTop10Accuracy=0.7765, over 10189.00 frames. ], tot_loss[loss=2.825, ArTop10Accuracy=0.7469, over 11680.86 frames. ], batch size: 12, lr: 6.68e-03
2024-08-06 06:17:22,913 INFO [trainer.py:765] (5/8) Epoch 18, batch 900, train_loss[loss=2.93, ArTop10Accuracy=0.726, over 12879.00 frames. ], tot_loss[loss=2.821, ArTop10Accuracy=0.7475, over 11738.31 frames. ], batch size: 27, lr: 6.67e-03
2024-08-06 06:17:54,528 INFO [trainer.py:765] (5/8) Epoch 18, batch 1000, train_loss[loss=2.694, ArTop10Accuracy=0.766, over 12936.00 frames. ], tot_loss[loss=2.83, ArTop10Accuracy=0.7462, over 11945.04 frames. ], batch size: 27, lr: 6.65e-03
2024-08-06 06:18:25,662 INFO [trainer.py:765] (5/8) Epoch 18, batch 1100, train_loss[loss=2.865, ArTop10Accuracy=0.7363, over 13745.00 frames. ], tot_loss[loss=2.839, ArTop10Accuracy=0.7443, over 12002.10 frames. ], batch size: 34, lr: 6.64e-03
2024-08-06 06:18:55,971 INFO [trainer.py:765] (5/8) Epoch 18, batch 1200, train_loss[loss=2.938, ArTop10Accuracy=0.726, over 12394.00 frames. ], tot_loss[loss=2.836, ArTop10Accuracy=0.7447, over 11951.43 frames. ], batch size: 98, lr: 6.63e-03
2024-08-06 06:19:19,163 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.178e+02 1.387e+02 1.492e+02 1.607e+02 2.982e+02, threshold=2.983e+02, percent-clipped=0.1
2024-08-06 06:19:23,779 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 06:20:29,728 INFO [trainer.py:765] (5/8) Epoch 19, batch 100, train_loss[loss=2.899, ArTop10Accuracy=0.7371, over 14803.00 frames. ], tot_loss[loss=2.81, ArTop10Accuracy=0.7505, over 4775.27 frames. ], batch size: 62, lr: 6.43e-03
2024-08-06 06:21:11,275 INFO [trainer.py:765] (5/8) Epoch 19, batch 200, train_loss[loss=2.666, ArTop10Accuracy=0.775, over 14085.00 frames. ], tot_loss[loss=2.801, ArTop10Accuracy=0.7527, over 7770.99 frames. ], batch size: 35, lr: 6.41e-03
2024-08-06 06:21:56,078 INFO [trainer.py:765] (5/8) Epoch 19, batch 300, train_loss[loss=2.818, ArTop10Accuracy=0.7519, over 14261.00 frames. ], tot_loss[loss=2.795, ArTop10Accuracy=0.7537, over 9409.99 frames. ], batch size: 44, lr: 6.40e-03
2024-08-06 06:22:36,013 INFO [trainer.py:765] (5/8) Epoch 19, batch 400, train_loss[loss=2.827, ArTop10Accuracy=0.7469, over 11058.00 frames. ], tot_loss[loss=2.799, ArTop10Accuracy=0.7524, over 10316.35 frames. ], batch size: 15, lr: 6.39e-03
2024-08-06 06:23:18,998 INFO [trainer.py:765] (5/8) Epoch 19, batch 500, train_loss[loss=2.733, ArTop10Accuracy=0.7689, over 12300.00 frames. ], tot_loss[loss=2.801, ArTop10Accuracy=0.752, over 10877.78 frames. ], batch size: 22, lr: 6.37e-03
2024-08-06 06:24:03,686 INFO [trainer.py:765] (5/8) Epoch 19, batch 600, train_loss[loss=2.779, ArTop10Accuracy=0.7626, over 11511.00 frames. ], tot_loss[loss=2.807, ArTop10Accuracy=0.7508, over 11421.42 frames. ], batch size: 18, lr: 6.36e-03
2024-08-06 06:24:46,187 INFO [trainer.py:765] (5/8) Epoch 19, batch 700, train_loss[loss=2.66, ArTop10Accuracy=0.781, over 10008.00 frames. ], tot_loss[loss=2.813, ArTop10Accuracy=0.7494, over 11564.74 frames. ], batch size: 12, lr: 6.35e-03
2024-08-06 06:25:22,355 INFO [trainer.py:765] (5/8) Epoch 19, batch 800, train_loss[loss=2.887, ArTop10Accuracy=0.7291, over 10049.00 frames. ], tot_loss[loss=2.814, ArTop10Accuracy=0.7491, over 11678.24 frames. ], batch size: 12, lr: 6.33e-03
2024-08-06 06:25:53,625 INFO [trainer.py:765] (5/8) Epoch 19, batch 900, train_loss[loss=2.631, ArTop10Accuracy=0.7877, over 12965.00 frames. ], tot_loss[loss=2.805, ArTop10Accuracy=0.7507, over 11713.42 frames. ], batch size: 27, lr: 6.32e-03
2024-08-06 06:26:21,774 INFO [trainer.py:803] (5/8) Computing validation loss
2024-08-06 06:26:30,765 INFO [trainer.py:811] (5/8) Epoch 19, validation: loss=2.918, ArTop10Accuracy=0.733, over 1829298.00 frames.
2024-08-06 06:26:30,766 INFO [trainer.py:814] (5/8) Maximum memory allocated so far is 33249MB
2024-08-06 06:26:31,053 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.198e+02 1.416e+02 1.525e+02 1.662e+02 2.849e+02, threshold=3.050e+02, percent-clipped=0.0
2024-08-06 06:26:34,031 INFO [trainer.py:765] (5/8) Epoch 19, batch 1000, train_loss[loss=2.847, ArTop10Accuracy=0.7399, over 13042.00 frames. ], tot_loss[loss=2.817, ArTop10Accuracy=0.7484, over 11905.17 frames. ], batch size: 27, lr: 6.31e-03
2024-08-06 06:27:05,190 INFO [trainer.py:765] (5/8) Epoch 19, batch 1100, train_loss[loss=2.85, ArTop10Accuracy=0.7451, over 13800.00 frames. ], tot_loss[loss=2.828, ArTop10Accuracy=0.7463, over 11968.02 frames. ], batch size: 34, lr: 6.30e-03
2024-08-06 06:27:35,454 INFO [trainer.py:765] (5/8) Epoch 19, batch 1200, train_loss[loss=2.985, ArTop10Accuracy=0.7162, over 11824.00 frames. ], tot_loss[loss=2.826, ArTop10Accuracy=0.7468, over 11905.60 frames. ], batch size: 97, lr: 6.28e-03
2024-08-06 06:28:00,610 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 06:29:08,984 INFO [trainer.py:765] (5/8) Epoch 20, batch 100, train_loss[loss=2.817, ArTop10Accuracy=0.7474, over 14655.00 frames. ], tot_loss[loss=2.797, ArTop10Accuracy=0.7536, over 4791.92 frames. ], batch size: 61, lr: 6.10e-03
2024-08-06 06:29:50,318 INFO [trainer.py:765] (5/8) Epoch 20, batch 200, train_loss[loss=2.79, ArTop10Accuracy=0.7503, over 13650.00 frames. ], tot_loss[loss=2.793, ArTop10Accuracy=0.7543, over 7793.65 frames. ], batch size: 34, lr: 6.09e-03
2024-08-06 06:30:37,105 INFO [trainer.py:765] (5/8) Epoch 20, batch 300, train_loss[loss=2.833, ArTop10Accuracy=0.7478, over 14093.00 frames. ], tot_loss[loss=2.79, ArTop10Accuracy=0.7547, over 9417.07 frames. ], batch size: 43, lr: 6.08e-03
2024-08-06 06:31:16,354 INFO [trainer.py:765] (5/8) Epoch 20, batch 400, train_loss[loss=2.782, ArTop10Accuracy=0.7602, over 10860.00 frames. ], tot_loss[loss=2.791, ArTop10Accuracy=0.7543, over 10322.85 frames. ], batch size: 15, lr: 6.07e-03
2024-08-06 06:32:03,759 INFO [trainer.py:765] (5/8) Epoch 20, batch 500, train_loss[loss=2.847, ArTop10Accuracy=0.7415, over 12177.00 frames. ], tot_loss[loss=2.795, ArTop10Accuracy=0.7533, over 10899.29 frames. ], batch size: 22, lr: 6.05e-03
2024-08-06 06:32:43,357 INFO [trainer.py:765] (5/8) Epoch 20, batch 600, train_loss[loss=2.709, ArTop10Accuracy=0.7684, over 11645.00 frames. ], tot_loss[loss=2.798, ArTop10Accuracy=0.7528, over 11421.14 frames. ], batch size: 18, lr: 6.04e-03
2024-08-06 06:33:36,752 INFO [trainer.py:765] (5/8) Epoch 20, batch 700, train_loss[loss=2.792, ArTop10Accuracy=0.7455, over 9336.00 frames. ], tot_loss[loss=2.802, ArTop10Accuracy=0.7517, over 11568.97 frames. ], batch size: 11, lr: 6.03e-03
2024-08-06 06:33:43,830 INFO [optim.py:386] (5/8) Clipping_scale=2.0, grad-norm quartiles 1.196e+02 1.417e+02 1.526e+02 1.639e+02 3.791e+02, threshold=3.052e+02, percent-clipped=0.1
2024-08-06 06:34:13,304 INFO [trainer.py:765] (5/8) Epoch 20, batch 800, train_loss[loss=2.812, ArTop10Accuracy=0.7496, over 10384.00 frames. ], tot_loss[loss=2.806, ArTop10Accuracy=0.7508, over 11675.68 frames. ], batch size: 12, lr: 6.02e-03
2024-08-06 06:34:44,580 INFO [trainer.py:765] (5/8) Epoch 20, batch 900, train_loss[loss=2.887, ArTop10Accuracy=0.7393, over 12969.00 frames. ], tot_loss[loss=2.803, ArTop10Accuracy=0.7513, over 11732.17 frames. ], batch size: 27, lr: 6.01e-03
2024-08-06 06:35:16,139 INFO [trainer.py:765] (5/8) Epoch 20, batch 1000, train_loss[loss=2.768, ArTop10Accuracy=0.7536, over 13008.00 frames. ], tot_loss[loss=2.809, ArTop10Accuracy=0.7502, over 11937.31 frames. ], batch size: 27, lr: 6.00e-03
2024-08-06 06:35:47,214 INFO [trainer.py:765] (5/8) Epoch 20, batch 1100, train_loss[loss=2.785, ArTop10Accuracy=0.7584, over 13661.00 frames. ], tot_loss[loss=2.818, ArTop10Accuracy=0.7484, over 12013.73 frames. ], batch size: 34, lr: 5.99e-03
2024-08-06 06:36:17,439 INFO [trainer.py:765] (5/8) Epoch 20, batch 1200, train_loss[loss=2.982, ArTop10Accuracy=0.7139, over 11908.00 frames. ], tot_loss[loss=2.82, ArTop10Accuracy=0.7477, over 11961.51 frames. ], batch size: 97, lr: 5.97e-03
2024-08-06 06:36:42,607 INFO [trainer.py:650] (5/8) Reaches end of dataloader.
2024-08-06 06:36:42,610 INFO [trainer.py:1069] (5/8) Done!